British Companies Utilizing “Bossware” to Monitor Employee Activities, According to Research

About one-third of UK employers are using “bossware” technology to monitor employee activities, predominantly through methods like email and web-browsing surveillance.

Private sector firms are the most inclined to implement onsite monitoring, with one in seven employers reportedly recording or assessing screen activities, as per a comprehensive UK study on office surveillance.

These insights, disclosed by the Chartered Management Institute (CMI) to the Guardian, are derived from feedback from numerous UK managers, indicating a recent uptick in computer-based work monitoring.

According to 2023 research by the Information Commissioner’s Office (ICO), fewer than 20% of respondents believed their employer was monitoring them. Set against the finding that roughly one-third of managers know their organizations track employees’ online activities on company devices, this suggests many workers underestimate how widely they are monitored; the one-third figure itself likely understates the practice, since managers may not be aware of all monitoring.

Many of these surveillance tools are designed to mitigate insider threats, safeguard confidential data, and identify dips in productivity. However, this growing trend seems to be inducing anxiety among employees. CMI highlights that many managerial figures oppose such practices, arguing they erode trust and infringe on personal privacy.

A manager at an insurance firm developing an AI system for monitoring staff screen activity expressed feelings of “unease,” questioning, “Do they trust employees to perform their roles? Is there an intention to replace them with AI?”

One employee monitoring service provides insights into workers’ “idle hours,” tracks “employee productivity,” flags unapproved AI or social media use, and offers “real-time data on employee behavior, including screenshots, screen recordings, keystrokes, and application usage.”

In light of these findings, the ICO emphasized that employers “must inform employees about the nature, scope, and reasons for surveillance,” noting that excessive monitoring “can infringe on personal privacy,” especially for remote workers, and warned it could take action where necessary.

Last year, the ICO ordered the outsourcing company Serco to stop using facial recognition technology and fingerprint scanning to monitor staff attendance at several leisure centres.

Monitoring often includes ensuring that inappropriate content isn’t accessed, according to CMI. However, they cautioned, “If it feels like an invasion, there can be long-term implications.”

Petra Wilton, policy director at CMI, stated, “If implemented, this could be of significant concern to employers and raise serious data privacy and protection issues.”

Recent examples of workplace surveillance include HSBC’s installation of numerous security cameras and 1,754 biometric readers to control access to its new London headquarters.

PwC has recently rolled out a “traffic light” system using badge swipes and wifi connection data to ensure staff attend the office at least three days a week. A PwC spokesperson said this was “well received by most of our employees.”

A former senior public transport worker, who requested anonymity, shared their experience of facing online surveillance, describing it as “distracting and deeply intrusive.”

“It began with surveillance, and I eventually left because I was extremely frustrated,” they noted. CMI research revealed that one in six managers would contemplate seeking new employment if their organization started monitoring online activities on work devices.

Among managers aware of their employers monitoring them, 35% indicated surveillance of emails. Overall, tracking login/logout times and system access emerged as the most prevalent form of monitoring.

The survey showed that 53% of managers endorse monitoring employee online activity on company devices, but 42% feel this not only undermines trust but also fails to enhance performance, potentially resulting in misuse or unjust disciplinary action.

Source: www.theguardian.com

Manufacturers of Air Fryers and Smart Speakers Urged to Uphold User Privacy Rights

The UK’s Information Commissioner’s Office (ICO) has urged manufacturers of air fryers, smart speakers, fertility trackers, and smart TVs to prioritize the privacy rights of individuals.

Many individuals feel overwhelmed by the lack of control over how data is gathered, utilized, and shared regarding their homes and personal lives.

Public concern has grown about digital devices collecting excessive amounts of personal data, following reports of an air fryer that listened to its surroundings, prompting the data protection regulator to issue its first guidance on the handling of personal information.


Manufacturers and data handlers are now required to ensure data security, maintain transparency with consumers, and routinely delete collected information.

“These products are meant to enhance our lives, but that doesn’t imply they should collect unnecessary data… Users shouldn’t have to sacrifice their privacy to enjoy the advantages of smart devices,” said Stephen Almond, the ICO’s executive director of regulatory risk.

“Everyone expects a degree of privacy within their own homes, thus we need to trust that smart products will honor that privacy, act responsibly, and utilize information only as anticipated.”

The new guidance points out various devices classified as part of the “Internet of Things,” which collect data that must be managed with care. These items include smart fertility trackers that log user periods and temperature readings, transmitting that data back to manufacturer servers to determine fertile days.

Smart speakers can pick up audio not only from their owners but also from other family members and visitors; they must be designed to empower users to adjust settings to minimize personal data collection.

Skip past newsletter promotions

Regulators emphasized that manufacturers must be clear with individuals regarding the use of their personal data, only collecting what is necessary, and enabling easy deletion of data from devices.

The ICO has stated to manufacturers that it is “prepared to take necessary actions to safeguard individuals from potential harm.”

Source: www.theguardian.com

How Did You Obtain My Number? Exploring the Underworld of Data Brokers

Priya Dev suspects she knows why political spam inundated her inbox during the 2025 federal election campaign.

Like many Australians, Dev endured an avalanche of unwanted patriotic text messages; Clive Palmer himself admitted to sending 17 million of them. However, it was email spam from one of the major political parties that caught her attention.

Political parties are exempt from privacy laws, meaning they aren’t required to inform individuals about how to access their data, nor is there an option to opt out.

Nonetheless, clues surfaced for the Australian National University data science academic. The email was sent to a pseudonym she had used for an online purchase years ago, one also used in 2020 when she received spam from a minor political party.

“It appears to originate from a transaction,” she stated. “It could likely be from some online e-commerce deal, or energy trading, etc.”

Tracing how organizations obtained your personal contact details is “extremely challenging with political parties because they often ignore inquiries,” Dev notes. “If we could uncover how this data was sourced, it would be groundbreaking.”


This marks the second time Dev has tried to trace how her data was accessed. Data brokers frequently buy and sell information to advertisers and other entities seeking insights into people without their consent.

After receiving numerous unsolicited calls last year, Dev tracked down who had her phone number. She traced it to the property data giant CoreLogic Australia, which informed her in 2023 that her data had been legally obtained from another data broker.

The company explained that it retrieved her data through a marketing campaign from 2014 and shared her information with at least 50 other companies.

Dev’s situation isn’t unique. Reports indicate that a child’s email, registered in a charity fundraiser over a decade ago, also received political spam from the Liberal Party during the recent election.

How did you get my number?

Understanding how marketers and others acquire your contact details and personal information is complex.

Katherine Kemp, an Associate Professor at the University of New South Wales leading the Public Interest Law and Technology Initiative, explains that this often occurs through data matching services that merge personal information from various service providers, subsequently sold via data brokers.
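To illustrate the kind of data matching Kemp describes, here is a minimal sketch in Python; the datasets, field names, and the choice of a normalized email address as the matching key are all hypothetical, and commercial matching services use far more elaborate linkage techniques.

```python
# Illustrative sketch of record linkage: two hypothetical datasets from
# different service providers, joined on a normalized email address to
# build a richer profile. All field names and data are invented.
def normalize_email(email: str) -> str:
    # Lowercase and strip whitespace so "Jo@Example.com " matches "jo@example.com"
    return email.strip().lower()

retailer_records = [
    {"email": "Jo@Example.com ", "name": "Jo Citizen", "purchase": "air fryer"},
]
energy_records = [
    {"email": "jo@example.com", "address": "1 Sample St", "plan": "solar"},
]

# Index one dataset by the matching key, then enrich the other with it.
by_email = {normalize_email(r["email"]): r for r in energy_records}
profiles = []
for record in retailer_records:
    key = normalize_email(record["email"])
    match = by_email.get(key)
    if match:
        profiles.append({**record, **match, "email": key})

print(profiles)
# [{'email': 'jo@example.com', 'name': 'Jo Citizen', 'purchase': 'air fryer',
#   'address': '1 Sample St', 'plan': 'solar'}]
```

Once merged in this way, a profile built from several providers can be sold on, which is why the resulting value chain is so hard for an individual to trace.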

Kemp says she has herself been contacted by brokers in the mortgage industry.

Yet, discovering how they obtained that information can be quite elusive, Kemp adds.

When she asks callers where they got her details, she says, “they either evade the question or give vague answers, and if pressed, they quickly end the call.”

Federal privacy commissioner Carly Kind described the data-broking sector as “extremely opaque,” with “a convoluted value chain of personal information.”

“Thus, people are left in the dark, lacking authority to voice their concerns,” she states.

“Many find it unsettling. I believe it’s concerning how personal information is relayed through data brokers, landing in unexpected places.”

Who are the data brokers and what do they collect?

A global data broker organization claimed it would “facilitate the exchange of information for consumer benefit and support to Australian businesses and small enterprises,” in a 2023 submission to the Australian consumer watchdog’s inquiry into data brokerage.

Information collected can include name, address, age, viewing habits, purchasing patterns, financial status, employment background, qualifications, tenant history, and other socio-economic and demographic details.




Dev says discussion of expanding privacy obligations to political parties could lead to mandated transparency about how they obtain personal data. Photo: Hilary Warduff/Guardian

A report last year by Reset.Tech Australia highlighted the types of data bought and sold by brokers, including location tracking over time, financial details, gambling and drinking tendencies, and recent online purchases.

Data broker companies include credit reporting agencies, identity verification firms, news organizations, real estate companies, tenancy data firms, marketers, loyalty programs, and social media platforms.

Australians “uncomfortable” with their personal information being sold

The Australian Competition and Consumer Commission uncovered in last year’s data broker report that privacy policies often use “ambiguous language,” complicating consumers’ ability to identify data sharing practices. This makes it tougher to ascertain who has their data and opt out of such collections.

The average privacy policy contains 6,876 words, and it reportedly takes 29 minutes to read.

A survey featured in the report indicated that 74% of Australians are uncomfortable with the idea of their personal information being shared or sold.

Some companies attempt to downplay privacy concerns by outlining data collection to consumers and offering access to information held upon request. A consumer group found last year that a data broker claimed it didn’t retain data from loyalty program members.


Privacy commissioner Kind noted that claims by data-collecting companies that the information they hold is not personal information covered by the Privacy Act could be a “creative interpretation” of the law.

According to the ACCC, de-identified data, when aggregated with data points from other sources, still poses a risk that consumers can be re-identified.

Overall, without naming specific companies, Kind said some practices within the data-broking industry would strike many Australians as “very uncomfortable at best, often indicating serious breaches of trust.”

“Data is often transferred multiple times, creating a complicated environment. While much of this is legal, [privacy] practices remain vague and often sit in a grey area of questionable legitimacy.”

Regulators can flex their muscles

Although the ACCC report didn’t issue any recommendations, it endorsed the strengthening of privacy laws in Australia.

Commissioner Kind indicated that the ACCC’s work could pave the way for her office to investigate practices in the sector, stressing that current privacy laws contain “various elements that could restrict practices applicable to data brokers.”

“I want to prioritize this issue, and my regulatory team is currently contemplating how to utilize our authority in this realm,” Kind remarked.

Dev emphasizes the need to discuss expanding privacy responsibilities for political parties to ensure they transparently disclose how they acquire personal data.

The current exemption allows political parties to bypass her inquiries concerning what data they hold, according to Dev.

Kemp believes there is potential for stricter laws on data brokering, but sees little political appetite to alter the legal obligations of political parties.

“However, I don’t believe we should abandon discussions on reform in this area.”

Source: www.theguardian.com

Google to Pay $1.4 Billion to Settle Dual Privacy Lawsuits

On Friday, Google agreed to pay Texas $1.4 billion to settle two lawsuits accusing it of violating state residents’ privacy through location tracking, search-history collection, and facial recognition data collection.

Attorney General Ken Paxton, who facilitated the settlement, initiated a lawsuit in 2022 under Texas’ data privacy and deceptive trade practices legislation. Less than a year later, he achieved a $1.4 billion settlement with Meta, the parent company of Facebook and Instagram.

This settlement marks another legal setback for the tech giant. In the last two years, Google has faced a series of antitrust cases targeting its significant control over app stores, search, and advertising technology, and it has recently been fighting U.S. government efforts to break up the company.

“Big tech must adhere to the law,” Paxton stated.

Google spokesperson José Castañeda remarked that the company has already revised its product policies. “This resolves numerous longstanding claims, many of which have found resolution elsewhere,” he noted.

Privacy concerns have caused significant friction between tech corporations and regulators in recent years. In the absence of federal privacy regulations, states like Texas and Washington have enacted laws to limit the collection of facial, voice, and other biometric data.

Google and Meta have been among the leading companies challenged under these regulations. The Texas law, known as the Capture or Use of Biometric Identifier Act, mandates that companies obtain consent before using features like facial and voice recognition technology. Violators can face penalties of up to $25,000 per breach.

The lawsuit under this law centered on products including the Google Photos app, which lets users search for images of specific people; Google cameras that can alert owners when they recognize visitors at the door; and Google Assistant, which is designed to learn and respond to the voices of up to six users.

Mr. Paxton filed another lawsuit claiming that Google misled Texans by tracking their personal location data, even when they believed they had disabled the feature. He asserted additional grievances in the lawsuit, alleging that Google’s private browsing settings (known as Incognito Mode) were not genuinely private. These cases were filed under the Texas Deceptive Trade Practices Act.

Source: www.nytimes.com

Protecting Your Data and Privacy: Opting Out with your Tesla

Welcome to Opt Out, a semi-regular column that helps you navigate online privacy and shows you how to say no to surveillance. The last column covered ways to protect the privacy of your phone and data at the US border. If you would like to skip to a specific section, use the “jump to” menu at the top of this article.

At the press of a button, a Tesla can use a feature called Summon to pull out of a parking spot with no one behind the wheel. Autopilot can steer the car on the highway. Once you arrive at your destination, a feature called Sentry Mode records activity around the vehicle while it is parked.

To operate any of these functions, your vehicle needs to monitor and collect large amounts of data about you. Most Tesla vehicles are equipped with nine internal and external cameras; combined with location trackers, sensors, and more, they can draw a complex picture of your life and movements.

“Teslas are truly surveillance platforms,” said John Davisson, senior counsel and litigation director at the Electronic Privacy Information Center, a digital rights research group. “There are some privacy safeguards Tesla offers that allow you to opt out of data sharing, at least in terms of settings, if Tesla actually follows those commitments.”

Tesla is not alone in collecting a substantial amount of data to enable a range of advanced in-vehicle features; all connected cars collect some information to work. But given CEO Elon Musk’s role in the Donald Trump administration, some Tesla owners have taken a renewed interest in how much their cars know about them.

These concerns are not entirely unfounded, according to Reem Suleiman, advocacy director at the Mozilla Foundation, whose “*Privacy Not Included” guide gave Tesla a failing grade in its assessment of car manufacturers’ data security practices. The company’s track record of protecting user data is shaky: in 2023, a Reuters investigation found employees had shared sensitive footage captured by customers’ vehicles on an internal messaging forum. Similarly, a Washington Post analysis of the transparency reports published by Twitter after Musk took over found that X acquiesced to 20% more government demands to remove content in the first half of 2024.

We spoke with privacy experts, consulted Tesla’s privacy policy, and asked Tesla’s own AI chatbot how to share as little data with the company as possible. If you own a Tesla, there are precautions you can, and in many cases should, take with your vehicle. Be warned, however: adjusting these settings to share minimal data with Tesla will shut off access to many of the car’s features.

“They say that if you choose to opt out of data collection, your car is essentially a lemon,” Suleiman said. “They basically say that your vehicle may suffer reduced functionality, serious damage, or inoperability. So, when you think about it, it’s kind of a non-choice.”

Here’s what you need to know about your Tesla vehicle privacy:

What information does your Tesla gather about you?

Tesla’s privacy policy details the wide range of data the vehicle collects. Most of it is stored locally, but will also be sent to the company unless you change your settings.

According to Tesla’s AI chatbot, the information collected includes: location data (Tesla says it does not record or store vehicle-specific GPS information except in the case of a crash); driving habits and behaviors (speed, braking patterns, acceleration and so on); diagnostic and vehicle usage data; data about the use of Tesla products, services, and voice commands; and camera and sensor data (camera recordings, sensor measurements, and other inputs that support advanced safety features).

The good news is that if you opt out of sharing information with the company, data is collected and stored only on your vehicle’s local drive. The bad news is that law enforcement agencies can access that data if they take custody of the vehicle during a traffic stop or other situation, according to Davisson of the Electronic Privacy Information Center.

And it’s not just your car that collects information. The Tesla mobile app, which allows you to activate features like Sentry Mode or Summon, also collects location, contact, browsing, and device information. Read on for how to dial back how much your car is monitoring you.

Who does Tesla share your data with?

If you choose to share your data with Tesla, everything it collects may be shared with third parties, including law enforcement. In addition to sharing data with law enforcement when it receives a valid court order, Tesla also shares data when it is “essential to protect national security or public safety,” according to the company’s privacy policy. Davisson says that language is ambiguous and opens the door to sharing video footage with law enforcement for many reasons. Recently, for example, the FBI has classified vandalism of Tesla vehicles and showrooms as “domestic terrorism.”

“It’s a very short hop to thinking of immigration enforcement as a security issue that justifies constant sharing, especially now, when everything is apparently a national emergency,” said Davisson, envisioning real-time footage being shared with ICE and other law enforcement agencies. “And that’s a horrible situation.”

So how do you protect your data?

Consider other vehicles

When it comes to privacy, everyone has a different risk profile. Some people may be more concerned than others about who has access to their personal information. Tesla’s exterior cameras can also collect footage of people around the vehicle, so it’s not just your own privacy that you need to be aware of.

If you’re worried about data collection and don’t yet have a Tesla, privacy experts say you should choose a different car. If you have one and want to regain your privacy, sell it. Even Tesla’s AI chatbot gave similar advice.

“If someone is deeply concerned about their privacy, they may consider alternatives, such as choosing a vehicle manufacturer with more restrictive data collection practices (though this may come at the cost of fewer convenience features).”

Factory-reset your car before selling it

So you’ve already bought a Tesla and decided to sell it. Suleiman and the Mozilla Foundation strongly recommend that you factory-reset the car before you do. This wipes all of its data, returning it to the state it was in when it left the lot.

“Just because you sold a car doesn’t mean you scrub your data,” Suleiman said. “That requires a bit of due diligence.”

Before you remove the vehicle from your Tesla account, you will need to factory-reset it from the vehicle’s settings. Sit in the driver’s seat and follow these steps:

1. Go to Controls > Services > Factory Reset.

2. Enter your Tesla account username and password to confirm your credentials.

3. Confirm that you want to perform the factory reset.

“This will erase all your personal data, including saved addresses, music favorites, imported contacts, and restore your car settings to factory defaults,” according to Tesla’s AI chatbot. The company also recommends removing “Homelink” devices. This allows you to control things like garage doors, lights, or home security systems from Tesla. You can click the Homelink icon at the top of the Controls screen to go to Homelink Settings and delete connections to other devices.

How to share the minimum amount of data with Tesla

In the vehicle and Tesla app settings, you can choose not to share your data with the company. But as noted above, be prepared to lose functionality in your car.

Opting out of data sharing in your car is fairly easy. In the vehicle, go to Settings > Software > Data Sharing and turn off “Allow Data Sharing.” This disables analytics, road segment, diagnostic, and vehicle usage data sharing. Tesla’s AI chatbot also recommended turning off “Allow Autopilot Analytics” to stop sharing autopilot-related data.

To opt out of location and data sharing in the mobile app: in the Tesla app, go to Settings (the button looks like three horizontal lines or a gear icon), then scroll down to Security and Privacy. From there, turn off location services, then turn off analytics to stop sharing app usage data with the company. Finally, revoke access to your camera, microphone, and contacts.

“By following these steps, you’ll be sharing the minimum amount of data with Tesla while enjoying the core functionality of your vehicle and mobile app,” according to Tesla’s AI chatbot.

However, opting out will affect how your car functions; according to Tesla’s chatbot, data sharing is tied to many of the features described above.

Source: www.theguardian.com

Exhibition Chronicles the Evolution of UK Privacy Rights: From Floppy Disks to Vaccine Cards

Four decades ago, a four-drawer filing cabinet was needed to house 10,000 documents. Later, the same volume of files fit on just 736 floppy disks. Today, the cloud stores 10,000 documents without occupying any physical space.

With the evolution of data storage comes a transformation in the information landscape. This evolution poses challenges related to the storage, transfer, and proper utilization of individuals’ personal data.

The Information Commissioner’s Office (ICO) organized an exhibition at the Manchester Central Library this week, showcasing 40 items that demonstrate the evolution of data privacy. Each item illustrates how access to information has changed over the past four decades and how data has become pivotal in major news events.

John Edwards, the information commissioner, expressed his appreciation for the exhibition, emphasizing the human impact of data-related matters and the significance of understanding terms like data controller, data processor, and data subject.

The exhibition features various items, including Pokémon toys, floppy disks, Tesco Clubcards, modems, millennium bug brochures, football shirts, and Covid vaccination cards. It also showcases how ICO interventions have brought about societal changes, such as ending the construction industry’s employment blacklist and introducing public food hygiene ratings for restaurants.

One of Edwards’ favorite exhibition items is the spiked lawn aerator shoes, symbolizing an early enforcement action in the 1980s against a company selling customer information obtained from shoe sales.




Information commissioner John Edwards’s favourite item at the exhibition: the spiked lawn aerator shoes. Photo: Christopher Tormond/The Guardian

The 40th pedestal at the exhibition remains unused, inviting the public to suggest objects that have influenced the data landscape. Edwards emphasized the personal and subjective nature of privacy, stating that each individual has unique expectations and experiences.

The ICO was founded as a UK data protection regulator near Manchester 40 years ago and now oversees new data protection laws. The regulatory landscape has undergone significant transformations since its inception.




NHS Covid Vaccination Card. Photo: Andy Rain/EPA

According to Edwards, there is now vastly more personal data about each individual than when the ICO was established, and its constant flow around the world illustrates the extensive data environment we now live in.

Edwards highlighted the challenge of keeping pace with the rapid changes in technology and data usage. The ICO regulates a wide range of entities, from small schools and GP surgeries to large social media companies, requiring continuous adaptation to address privacy implications.

Reflecting on the future, Edwards acknowledged the uncertain geopolitical landscape, emphasizing the potential impact of quantum computing and advanced AI technologies on data handling and privacy in the coming years.

Source: www.theguardian.com

Are there privacy concerns with the “magic eye” surveillance cameras in NHS mental health units?

In July 2022, Morgan Rose Hart, an aspiring veterinarian with a passion for wildlife, died after being found unresponsive in an Essex mental health unit. She had just turned 18. Diagnosed with autism and attention deficit hyperactivity disorder (ADHD), Hart had been bullied so badly that she moved schools several times, and her mental health suffered. She had previously attempted to take her own life and was transferred to the Harlow unit three weeks before her death.

Hart, from Chelmsford, Essex, died on July 12, 2022, after being found unresponsive on a bathroom floor. A prevention of future deaths report issued after her inquest found that important checks were missed, observation records were falsified, and risk assessments were not completed.

An investigation by the Observer and the newsletter Democracy for Sale has established that her death is one of four connected with a high-tech patient surveillance system called Oxevision, which is deployed in almost half of mental health trusts across the UK.

The Oxevision system can measure a patient’s pulse rate and breathing without anyone entering the room and disturbing the patient at night, and can also temporarily broadcast CCTV footage if necessary. The system can detect a patient’s breathing rate even when the patient is under a duvet.

Oxehealth, which was spun out of the University of Oxford’s Institute of Biomedical Engineering in 2012, has agreements with 25 NHS mental health trusts and, according to its latest accounts, reported revenue of around £4.7m for the year to December 31, 2023.

However, it is argued that in some cases staff rely too heavily on the infrared camera system to monitor vulnerable patients instead of carrying out physical checks.

There are also concerns that the device, which can glow red from a corner of the room, could exacerbate the distress of patients in mental health crisis who are already highly sensitive to monitoring or control.

Sophina, a former patient who was monitored by Oxevision and asked that her full name not be used, said: “It was the first thing you saw when you opened your eyes and the last thing you saw when you fell asleep… I was completely distressed.”

Advocates argue that the technology can improve safety, but this weekend campaigners are calling for a halt to Oxevision’s deployment, raising concerns about patient safety, privacy rights, and conflicts of interest in the research supporting its use. The campaign group says Oxevision is often installed in patients’ bedrooms without proper consent and that the surveillance is likely to cause distress.

In a prevention of future deaths report issued in December 2023 after Hart’s inquest, the coroner noted that if a person was in the bathroom for more than three minutes, a staff member should “complete a face-to-face check.” Instead, the “Oxevision red alert was reset” by staff, Hart was not observed for 50 minutes, and she was then discovered “not responding on the bathroom floor.”

The coroner expressed concern that “some staff may have used Oxevision as a replacement for, rather than an aid to, face-to-face observation.” The inquest’s conclusion was death by misadventure, contributed to by neglect.

Two days before Hart’s death, Michael Nolan, 63, a warehouse operative at risk of self-harm, died as a mental health inpatient at Basildon Hospital. His inquest found that staff used Oxevision as an alternative to physical observations and failed to carry out effective checks; the coroner’s findings included inadequate training on the Oxevision system.

The following month, Sophie Alderman, 27, who had a history of self-harm, died as an inpatient at Rochford Hospital in the care of Essex Partnership University NHS Foundation Trust. Her family says the Oxevision system caused her distress and harmed her mental health. A few months before her death, she complained about the camera in her room, believing it had been hacked by the government.

Tammy Smith, Alderman’s mother, told the Observer: “I don’t think Oxevision is effective in keeping patients safe. It’s a major invasion of patient privacy.”

“Staff aren’t properly trained on it, or don’t use it properly. People have died while Oxevision was in use, and questions have been raised about its use. That’s enough to pause deployment and properly consider whether this technology keeps patients safe.”

The Care Quality Commission has also raised concerns. An inspection report on an NHS foundation trust released last February described a “sad death” in a seclusion room at St Charles hospital in west London, where staff had relied on Oxevision instead of fully engaging in monitoring the patient, and the system was turned off at the time.

The trust said this weekend that the “tragic death” in March 2023 had led to the dismissal of three individuals, and that the technology was never designed to replace staff responsibility and care.

The Lampard inquiry, which is examining the deaths of mental health inpatients in the care of NHS trusts in Essex between January 2000 and December 2023, is being asked to investigate Oxevision.




Sophina, a former patient monitored by Oxevision.

Bindmans, a law firm representing Alderman’s family and another patient’s family, has written to Baroness Lampard about concerns over consent and the safety and effectiveness of the system, saying there are worries that staff may delegate responsibility for monitoring patients to a “digital eye.”

A review by the National Institute for Health and Care Research (NIHR), commissioned by NHS England and published in November, examined nine studies on Oxevision along with other research, finding “inadequate evidence” that surveillance technologies in inpatient mental health units achieve their intended results, such as improved safety or reduced costs.

Only one of these papers was rated “high quality” for its methodology and reported no conflicts of interest. The other eight studies all reported conflicts of interest, all related to Oxehealth; in some cases, Oxehealth employees were co-authors.

“There’s no independent research. There’s almost always been involvement from the companies that create and market these devices,” said Alan Simpson, a professor of mental health nursing who co-authored the review.

The Stop Oxevision campaign said it was worried about the threat the technology poses to patients’ “safety, privacy and dignity.”

Lionel Tarassenko, professor of electrical engineering at Oxford University and founder of Oxehealth, said Oxevision broadcasts CCTV footage of patients only intermittently, for up to 15 seconds when clinical staff respond to an alert, and that staff see only blurred video.

Tarassenko said the papers reviewed by the NIHR team showed the benefits of Oxevision, including reduced self-harm and improved patient sleep and safety. He added that they were written by independent clinicians who maintained editorial control and that, in some cases, Oxehealth co-authors were included to reflect their contributions.

He said: “There is no evidence that proper use of Oxevision technology is a factor that contributes to inpatient deaths. The experience of Oxevision patients is very positive.”

In a statement, the company said the Oxevision platform complies with NHS England’s principles on digital technologies in mental health inpatient treatment and care, announced last month, which state that decisions to use such technology must be based on consent.

The company said: “Oxevision supports clinical teams to improve patient safety, reduce incidents such as falls and self-harm, and ensure staff can respond more effectively to clinical risks.” It added that it welcomed dialogue on the responsible, ethical deployment of the technology.

Paul Scott, chief executive of Essex Partnership University NHS Foundation Trust (EPUT), which was responsible for the care of Hart, Nolan, and Alderman, said the patients’ deaths were devastating and offered his sympathies to those who had lost loved ones. He said: “We are constantly focused on providing the best possible care, and use remote surveillance technology to enhance safety and complement the care and observations our staff carry out.”

A spokesperson for NHS England said: “Vision-based surveillance technologies must support a human rights-based approach to care, be used only within the scope of legal requirements, and ensure that patient and family consent is obtained.”

A spokesperson for the Department of Health and Social Care said: “These technologies should only be used with robust staff training and appropriate consent, and we are transforming the care that people facing a mental health crisis receive by modernising mental health law.”

Source: www.theguardian.com

Protecting Your Mobile and Data Privacy at US Border Crossings

Welcome to Opt Out, a semi-regular column that helps you navigate online privacy and shows you how to say no to surveillance. The last column covered what to do with your 23andMe account after the company filed for bankruptcy. If you would like to skip to a specific section, use the “jump to” menu at the top of this article.

If you are a visa or green card holder planning to travel to the US, reports of people being turned away at airports over messages found on their devices may make you think twice about your plans. You may wonder whether Customs and Border Protection (CBP) can search your phone, whether you can opt out, and what you can do to minimize the risk.

The simple answer is yes, CBP can search your devices; constitutional protections are generally weaker at US borders, including airports. You can try to opt out, but you must be willing to risk the potential consequences of not complying, which can include confiscation of your device.

Privacy experts say everyone needs to conduct a personal risk assessment, taking into account immigration status, travel history, and the data on their phone. Data that is sensitive for you may not be for others, and no single solution fits every situation: your assessment should shape, for example, whether you would comply if CBP attempted to search your phone, and how thoroughly you lock down your device before heading to the airport.

Although CBP says it searched only about 47,000 devices of the 420 million people who crossed US borders in 2024, it is hard to gauge the risk of a device search because, as the Guardian has reported, border enforcement has been unpredictable under the Trump administration. French officials said a French scientist was recently turned away at a Texas airport after immigration officers found texts critical of Trump on his phone.

“The ultra-conservative perspective is to assume that they are completely unconstrained, and that non-citizens traveling for even the most benign reasons may be targeted for these device searches,” said Sophia Cope, a senior staff attorney at the Electronic Frontier Foundation (EFF), a nonprofit digital rights group.

If you are a US citizen, you must be admitted into the country. That said, there is still some risk of devices being searched for domestic reasons, as CBP can work with the FBI or local police to further domestic investigations in some jurisdictions.

There are steps you can take to make it difficult for CBP officers to access your device and its data. So what should you do to protect your phone data from searches? The main thing is to prepare before heading to the airport. Here is what to think about:

Before you travel, start preparing for the possibility of being pulled into secondary screening. First, decide whether you will comply if immigration officers ask to search your devices. They may request the password to your phone or ask you to unlock the device; ideally, unlock it yourself rather than sharing your password. You can decide not to give your consent, but that carries its own risks.

From EFF’s guide to privacy at the border: “This presents a no-win dilemma. If the traveler is compliant, the agent can scrutinize and copy sensitive digital information. If the traveler declines, the agent can seize the device and expose the traveler to additional questioning and detention.”

If you refuse a search, the officer may decide you are low risk and not worth the trouble, and let you go. On the flip side, refusal can lengthen the process, or the border agent can confiscate the device. If that happens, ask for a property receipt documenting that they have your device, so you can try to get it back later. Even without your password, immigration staff can try to unlock the device themselves using a variety of forensic tools, or simply try to guess the password, so make sure yours is strong and long.

There are many reasons you might not want to risk being detained longer or having your device confiscated.

If you plan to comply with a phone search to avoid further complications, know that officers may search your phone manually or with forensic tools; it’s worth preparing for both types of search.

Turn off your phone and biometric IDs before entering the US

EFF recommends turning off your device completely before entering the US. This returns your phone or laptop to a heightened security state, which can make it much harder for anyone to break the encryption on the device.

Privacy advocates also recommend making sure your device requires a password to decrypt or unlock it. If you use Face ID or a fingerprint to unlock your phone, officers could use your face or finger to access the device.

Don’t wipe your phone

You may think the most protective option is to wipe your phone completely before traveling, use a burner phone, or travel without a phone at all. However, EFF’s Cope said this could actually raise suspicion.

Doing so can itself look bad and invite further scrutiny, Cope said: “If you cross a border without data on your device, that can in itself be considered suspicious.”

Instead, if there is data or text you don’t want accessed, she suggests selectively deleting that information rather than wiping the entire device.

Encrypt your data and use strong passwords

The most important step to take before you travel is to encrypt the data on your device. This is different from using an encrypted messaging service such as Signal: device encryption makes it difficult for CBP officers to access files on your phone or laptop, or to recover deleted files, even if they confiscate the device and subject it to sophisticated forensic tools.

Fortunately, all recent iPhone models and most Android phones turn on full-device encryption automatically. On Android, double-check that yours is turned on under the “Advanced settings” tab of the Security menu. Since encryption only stops CBP from walking through the front door of the device, you also need to choose a strong password that is not easy to guess. Here is a good primer on how to create a strong password.

“This encryption is only as good as the passphrase someone uses on their device,” said Bill Budington, senior staff technologist at EFF. “So the best advice is to choose a strong passphrase of 9–12 random characters (or 4–5 words) for your device, and to make sure biometric unlocks like Face ID and Touch ID are turned off when passing through sensitive areas such as checkpoints, or anywhere else a device can be confiscated.”
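For readers who want to follow that advice, here is a minimal sketch of a diceware-style passphrase generator in Python. It assumes a local copy of a wordlist (the EFF large wordlist filename below is a placeholder), and the entropy arithmetic shows why 4–5 random words is considered strong.

```python
# Minimal sketch of the "4-5 random words" advice above, assuming a local
# wordlist file (e.g., the EFF large wordlist saved as eff_large_wordlist.txt,
# one entry per line; the filename here is a placeholder).
import math
import secrets

def generate_passphrase(path: str = "eff_large_wordlist.txt", words: int = 5):
    with open(path) as f:
        # EFF's list lines look like "11111\tabacus"; keep only the word.
        wordlist = [line.split()[-1] for line in f if line.strip()]
    # secrets uses a cryptographically secure RNG, unlike the random module.
    phrase = " ".join(secrets.choice(wordlist) for _ in range(words))
    entropy_bits = words * math.log2(len(wordlist))
    return phrase, entropy_bits

phrase, bits = generate_passphrase()
print(phrase)               # e.g. "corral striking verse umpire subpar"
print(f"~{bits:.0f} bits")  # a 7,776-word list gives ~12.9 bits/word, ~65 bits for 5 words
```

At roughly 65 bits of entropy, such a passphrase is far beyond practical password-guessing, which is the point of Budington’s advice.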

Laptops, on the other hand, don’t all come with full-device encryption turned on, but built-in tools can encrypt your data. macOS has a tool called FileVault, which you can find by searching from the top right corner of the screen. Some Windows computers come with a tool called BitLocker that can be used to encrypt the device. EFF has a complete list of encryption tools for various operating systems.

For those traveling with devices owned by their employer or someone else, have a conversation with the owner before you travel to make sure the devices are well protected.

How to safely delete data

In addition to encrypting your device, you may want to delete certain texts, apps, photos, and other data that you consider sensitive or don’t want to show to government agents.

There are several steps to safely deleting this data, and some limitations. Since wiping your phone completely may arouse suspicion, deleting specific files is the more practical option. That may be effective against a manual, cursory search, but may not be sufficient if US immigration personnel conduct a more sophisticated search: the files may not be fully erased, or references to them may remain on your device.

In addition to ensuring your device is encrypted, make sure deleted files are also cleared from the trash. In iMessage, for example, tapping Filters in the top left corner reveals a “Recently Deleted” folder; make sure you clear texts from there too. On iPhone, a file is only permanently deleted once it has been removed from both the main iMessage interface and the “Recently Deleted” folder, according to the company.

Cope recommends preemptively deleting apps you don’t want searched. This protection is incomplete, as an advanced search may reveal that an app was installed, but in a manual search, for example, it is a way to keep your WhatsApp messages from being browsed.

Move your data to a cloud storage server

In a law enforcement search within the US interior, cloud storage servers are less protected than devices. At the border, however, current policy prohibits CBP from searching online cloud services. In practice, that means immigration officers are supposed to put your phone into airplane mode before searching it.

“They specifically say that officers are only allowed to review data that is ‘resident on the device,’” Cope says. “It’s data that’s actually on your phone, laptop, or camera hard drive. If it’s an internet-connected device, it’s supposed to be disconnected from the internet.”

If there is data you cannot or don’t want to permanently delete, you can remove it from your device and save it to cloud storage such as iCloud, Google Drive, or Microsoft OneDrive.

This is a high-level guide that may not cover the details of your situation. The Electronic Frontier Foundation has a complete, comprehensive guide on how to protect your devices.

Source: www.theguardian.com

Privacy group claims that the gov.uk app could pave the way for a mandatory ID scheme

Privacy activists have warned that a new app holding citizens’ driving licences, passports, and welfare documents risks becoming a “mandatory ID scheme launchpad.”

Last week, Peter Kyle, the technology secretary, introduced plans for a GOV.UK app and a GOV.UK wallet.

The wallet will hold documents including proof of the right to work in the UK, welfare entitlements, veteran ID cards, and DBS certificates. The technology incorporates biometric security such as face scanning. Similar e-government apps have been implemented in countries such as Poland, Estonia, and Iceland.

Kyle suggests that while paper documents will still be available, the convenience of the app makes its adoption “inevitable” as more people find it “compelling.” He is working diligently to make this a reality.

Kyle explained that the app features a digital document wallet similar to those found on Apple and Google smartphones, which have “completely revolutionised” shopping, banking, and travel.

Mock-up of digital driving licence pages stored in the gov.uk “wallet” on a smartphone. Photo: Department for Science, Innovation and Technology/PA

However, activists are now demanding transparency about the new system’s privacy impacts before it is deployed. Campaigners warned:

“Kyle may not always be in charge of the Department for Science, Innovation and Technology, and a future government could easily use the optional digital wallet as a launchpad for a mandatory ID scheme.”

“By including our facial recognition data, this vast identity system would become a honeypot: highly sensitive and tempting for hackers.”

James Baker, Open Rights Group campaign manager, comments:

“Are we heading towards a world that prioritises convenience over everything else? One potential concern is the development of a national identity database where all our interactions are monitored.”

Kyle unveiled the technology this week with a presentation inspired by Silicon Valley launches. He said the app will launch in June, with the wallet to follow, “securely storing government-issued digital documents.” The hope is that the app will integrate with various government services, from paying vehicle tax to managing welfare benefits and interacting with the NHS.

Officials say the app and wallet rely on the strong security of modern smartphones, incorporating facial recognition checks similar to those used for digital bank card payments. There is no central database of the documents stored in wallets, so hackers would need access to individual mobile phones.

A government spokesperson explained: “By using the same technology that secures a digital bank card, a digital driving licence is far more secure than a physical copy: even if the mobile phone is stolen, the thief cannot access it.

“It is evident that digital identity is not only advantageous but also opens up new possibilities, such as maintaining call centers and in-person support for those requiring assistance with digital services.”

The credentials stored in the wallet should be harder to forge than paper documents. Consolidating essential documents in one place also removes the risk of losing or damaging physical copies, with digital updates and streamlined paperless applications.

It could also enhance privacy in certain scenarios: for example, citizens would no longer need to reveal their name and address when buying age-restricted items like alcohol or fireworks.
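The government has not published the wallet’s cryptographic design in this detail, but the properties described, forgery resistance with no central database, are what digital signatures provide. Below is a minimal sketch assuming Ed25519 signatures (via the Python `cryptography` package) and a hypothetical over-18 credential; it is an illustration of the general technique, not the actual gov.uk implementation.

```python
# Minimal sketch of a digitally signed, minimal-disclosure credential.
# Assumes Ed25519 signatures via the `cryptography` package; the over-18
# claim and its format are hypothetical.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# An issuer (e.g., a government agency) signs a claim that reveals only
# what the verifier needs: an over-18 flag, not a name or address.
issuer_key = Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()

claim = json.dumps({"over_18": True}, sort_keys=True).encode()
signature = issuer_key.sign(claim)  # stored in the wallet alongside the claim

# A shop's verifier checks the issuer's signature offline: no central
# database lookup, and forging the claim requires the issuer's private key.
try:
    issuer_public.verify(signature, claim)
    print("credential accepted:", json.loads(claim))
except InvalidSignature:
    print("credential rejected")
```

Because verification needs only the issuer’s public key, any tampering with the signed claim fails the check, and no register of who presented which credential is required.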

The government would also retain the ability to revoke credentials if the system were misused; if that were ever done unjustly, citizens could still rely on paper copies.

Source: www.theguardian.com

Privacy Concerns Rise as Smart Devices Gain Popularity in Australian Homes

Do you have any smart devices or remotely controllable appliances in your house? These devices have become a common feature of modern homes over the past decade, offering convenience but also raising privacy concerns. Smart devices collect, share, aggregate, and analyze data, posing potential risks to personal information. According to Katherine Kemp, an expert in law and data privacy, Australia’s privacy laws are not up to date, a problem shared globally. The information collected by smart devices can be used for targeted advertising, and it’s unclear where the data ends up.

While smart devices offer benefits such as environmental friendliness, Kemp believes that their main purpose is to collect and sell more information rather than promoting environmental sustainability. There’s a concern that companies use this data for targeted advertising and other commercial purposes, potentially creating detailed profiles of individuals.

Concerns about privacy and consent models have been raised by Sam Floreani, the policy director at Digital Rights Watch. The collection and use of data depend on underlying incentives, and it’s essential for individuals to fully understand the implications of sharing their data. He also mentioned the need for improving consent laws and rights around personal data.

Australia’s current privacy laws require consent, but customers are not always given the right information to make informed choices. The government is planning an overhaul of the law to bring it into the “digital age” and strengthen enforcement powers for privacy watchdogs.

Convenience and privacy

Some argue that sacrificing privacy for convenience is worth it, especially if it improves accessibility. For the visually impaired community, smart devices play an important role in reducing social isolation. However, concerns remain about the trade-off between convenience and privacy.

“That’s too tempting.”

Early concepts of smart homes focused on collecting data solely for the occupants’ purposes. However, the potential for lucrative behavioral advertising services led to a shift in the use of this data. Changes in privacy laws are needed to establish stricter standards for how companies behave regarding smart devices.

Source: www.theguardian.com

The potential negative impacts on privacy of refusing cookies compared to accepting them

When you open a new website, do you choose to either accept or reject all cookies? New research suggests that choosing to reject cookies may actually reveal more information about yourself than if you simply “accept all cookies.”

In this digital age, cookies, which are pieces of data, are stored on our devices for various purposes such as remembering login information. However, cookies can also be used to track online behavior and tailor ads to individuals. This has led many people to refuse cookies in order to keep their personal information private and to prevent companies from using their data for marketing purposes.

Surprisingly, recent research presented at NeurIPS 2023 has shown that certain demographics are more likely to reject cookies, which could have implications for advertisers. Contrary to popular belief, rejecting cookies does not necessarily mean hiding one’s identity.

Results from the study showed that acceptance of cookies varies depending on the country and age of the individual. For example, individuals in the US over the age of 34 were more likely to refuse cookies, inadvertently sharing more of their data.

Advertisers could access five pieces of information about a person who accepts cookies, but only two about a person who rejects them. Even so, declining cookies does not necessarily protect your identity: older adults have been found to trust tech companies less and are therefore more likely to reject cookies, so the act of rejection itself signals demographic information.

It is important to note that the privacy laws in the United States are not as stringent as those in the EU, meaning that user data may not be as protected, especially for individuals who reject cookies.

Privacy is important to some people, but not to everyone

Researchers found that older Americans are less likely to accept cookies: only 28% choose to do so, compared with 40% of younger Americans. Because rejection therefore signals an older user, the decision to reject does more to protect the identity of younger Americans.
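To make the demographic leak concrete, here is a small worked example applying Bayes’ rule to the acceptance rates above; the 50/50 split between age groups is an illustrative assumption, not a figure from the study.

```python
# Worked example: how a cookie choice can leak demographic information.
# Uses the acceptance rates reported above (28% of older Americans accept,
# 40% of younger Americans accept); the 50/50 prior is an assumption.
p_accept = {"older": 0.28, "younger": 0.40}
prior = {"older": 0.5, "younger": 0.5}

def posterior(observed_reject: bool) -> dict:
    # Bayes' rule: P(group | observation) is proportional to
    # P(observation | group) * P(group)
    likelihood = {g: (1 - p) if observed_reject else p
                  for g, p in p_accept.items()}
    unnorm = {g: likelihood[g] * prior[g] for g in prior}
    total = sum(unnorm.values())
    return {g: v / total for g, v in unnorm.items()}

print(posterior(observed_reject=True))   # {'older': ~0.545, 'younger': ~0.455}
print(posterior(observed_reject=False))  # {'older': ~0.412, 'younger': ~0.588}
```

On these illustrative numbers, a single “reject” click shifts an observer’s belief that the visitor is in the older group from 50% to about 55%: a small leak on its own, but one that compounds with other signals.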

The authors of the research are hopeful that their findings will help shape regulations for new technologies such as AI and inform policymakers. They emphasize that the decision to accept or reject cookies is not as simple as it may seem and may not always produce the intended effect.

So… does it make sense to refuse cookies?

Ultimately, the impact of privacy decisions is more complex than users may realize, and there are no definitive answers. Some recommendations include using more privacy-friendly browsers and supporting stronger privacy laws and regulations.

About our experts

Dr. Elizabeth Daly is a computer research scientist leading the Interactive AI Group at IBM Research in Dublin. She has published research in journals such as Frontiers of Artificial Intelligence and Artificial Intelligence Research Journal.

Dr. Eric Mealing is a researcher at IBM Research with publications in Advances in Neural Information Processing Systems, American Control Conference (ACC), and International Conference on Decision Making and Game Theory for Security.

Source: www.sciencefocus.com

India’s New Telecommunications Law raises Privacy Concerns as it Clears the Way for Musk’s Starlink

With more than 1.17 billion phone connections and 881 million internet subscribers, India aims to modernize connectivity and introduce new services such as satellite broadband just months before general elections. To that end, its parliament has passed a telecommunications bill replacing a rule more than a century old.

India’s upper house of parliament on Thursday approved the Telecommunications Bill 2023 by voice vote, with many opposition leaders absent due to suspension, a day after the lower house passed it. The bill repeals rules dating back to the telegraph era of 1885 and gives Prime Minister Narendra Modi’s government the authority to manage telecommunications services and networks in the interest of national security, to monitor data, and to intercept communications.

The newly passed bill also allows spectrum for satellite-based services to be allocated without an auction, a move that favors companies such as OneWeb, Starlink, and Amazon’s Kuiper, which want to launch satellite broadband in the world’s most populous country and have long demanded an administrative process for spectrum allocation instead of auctions. India’s Jio is trying to compete with the three global companies through its homegrown satellite broadband service, but has relatively limited resources and had previously opposed the administrative allocation model.

The bill also requires biometric authentication for subscribers to limit fraud and limits the number of SIM cards each subscriber may use. Additionally, it includes provisions for civil monetary penalties of up to $12,000 for violations of certain provisions and up to $600,400 for violations of conditions established by law.

The bill also amends the Telecom Regulatory Authority of India Act, 1997, as the government seeks to attract foreign investors by increasing private participation. The amendments allow private-sector executives to be appointed to the regulator: those with more than 30 years of experience can serve as chairperson, and those with 25 years or more as members. Previously, only retired civil servants could serve as chairpersons and commissioners of the regulator.

“This is a very comprehensive and very large-scale structural reform born out of the vision of Prime Minister Shri Narendra Modi Ji. The telecom sector will shed its legacy of old scams, and this bill will make arrangements for it to become a rising sector,” said Ashwini Vaishnaw, India’s telecom minister, while introducing the bill in parliament.

Interestingly, the bill drops the term “OTT,” which appeared in last year’s first draft that set out regulations for over-the-top messaging apps such as WhatsApp, Signal, and Telegram. Industry groups such as the Internet and Mobile Association of India, whose members include Google and Meta, praised the change. However, the scope of the regulation is not clearly defined in the document, and Shivnath Thukral, Meta’s head of India public policy, warned in an internal email that the government may retain the power to classify OTT apps as telecommunications services in the future and subject them to licensing regimes, according to a report by the Indian outlet Moneycontrol.

Digital rights activists and privacy advocates have also raised concerns about the ambiguity surrounding the regulations and the lack of public consultation on the final version of the bill.

Apar Gupta, founding director of the digital rights group Internet Freedom Foundation, said at a public event earlier this week that the bill lacks safeguards for those targeted by surveillance.

“The Department of Telecommunications still refuses to create a central repository on internet shutdowns, thereby reducing transparency. The core requirements for telecommunications rules are being completely ignored,” he emphasized.

Digital rights group Access Now called for the bill to be withdrawn and a new draft to be drafted through consultation.

“This bill is regressive because it entrenches colonial-era powers to intercept communications and shut down the internet, and it undermines end-to-end encryption, which is critical to privacy,” said Namrata Maheshwari, Asia-Pacific policy advisor at Access Now, in a prepared statement.

The bill is currently awaiting approval from the President of India to become an official law.

Source: techcrunch.com