Students Push Back Against AI-Taught Course: ‘I Could Have Just Asked ChatGPT’

Students at Staffordshire University said they felt “deprived of knowledge and enjoyment” after realizing that the course they hoped would launch their digital careers was largely delivered through AI.

James and Owen were among 41 students who enrolled in a coding module at Staffordshire last year as part of a government-funded apprenticeship program aimed at preparing people for careers as cybersecurity experts or software engineers.

However, as AI-generated slides, intermittently narrated by an AI voiceover, kept appearing, James began to lose confidence in the program and its administrators, fearing he had “wasted two years” of his life on a course designed “in the most cost-effective manner.”

“If I were to submit something created by an AI, I’d be expelled from the university, yet we are being instructed by an AI,” James remarked during a confrontation with an instructor, captured in a course recording from October 2024.

James and his peers have engaged in several discussions with university officials regarding the use of AI in their coursework. Nonetheless, the university seems to persist in utilizing AI-generated materials for instruction. This year, it posted a policy statement on its course website rationalizing the use of AI, detailing a “Framework for Academic Professionals to Leverage AI Automation” in their academic activities and teaching.

The university’s own policy states that students who outsource assignments to AI or present AI-generated work as their own are breaching its integrity policy and could face academic misconduct charges.

“I’m in the midst of my life and career,” James lamented. “I don’t feel I can just leave and start over now. I feel trapped on this path.”

The situation at Staffordshire is becoming increasingly common. Universities are integrating AI tools to assist students, develop course materials, and provide tailored feedback. A Department for Education policy paper released in August welcomed this trend, asserting that generative AI “has the potential to revolutionize education.” A survey conducted last year by education technology firm Jisc among 3,287 higher education faculty revealed that almost a quarter use AI tools in their teaching.

For students, AI education seems to be more demoralizing than transformative. In the US, students have voiced their discontent in online reviews of professors who use AI. In the UK, undergraduates have turned to Reddit to express frustration over instructors copying and pasting feedback generated by ChatGPT or using AI-generated images in coursework.

“I recognize there’s pressure compelling instructors to use AI, but I’m just disappointed,” one student wrote.

James and Owen realized “almost immediately” that AI was being utilized in their Staffordshire course last year, notably during their first class when the instructor presented a PowerPoint with an AI audio reading the slides.

Shortly thereafter, they began to notice indications that some course materials were AI-generated, including inconsistent editing of American and British English, suspicious file names, and “general, surface-level information” that sometimes cryptically referenced U.S. law.

Signs of AI-generated content persisted this year. In one course video uploaded online, the narration introducing the material shifted to a Spanish accent for approximately 30 seconds before reverting to a British accent.




Narration accent changes during lesson in allegedly AI-generated course – video

The Guardian examined the course materials at Staffordshire and utilized two distinct AI detectors (Winston AI and Originality AI) to assess this year’s content. Both indicated that numerous assignments and presentations were “highly likely to have been generated by AI.”

James first raised his concerns at a monthly meeting with student representatives early in the course. Then, in late November, he challenged the use of AI during a lecture that was recorded. In the recording, he asks the instructor not to bother going through the slides.

“Everyone knows these slides were generated by AI. We would prefer if they were discarded,” he stated. “I don’t want guidance from GPT.”

Shortly after, the student representative for the course responded, “We conveyed this feedback, James, and the reply was that instructors can use diverse tools. This answer was quite frustrating.”

Another student commented: “While there are some helpful points in the presentation, only 5% of it is useful. There’s valuable content buried here, but perhaps we can extract that value ourselves by consulting ChatGPT.”

The lecturer laughed awkwardly, saying, “I appreciate the honesty…” before shifting to discuss another tutorial he had created using ChatGPT. “Honestly, I did this on very short notice,” he added.

Ultimately, the course director told James that AI would not be used in the final session and that the material would be reviewed by two human instructors.

In response to inquiries from the Guardian, Staffordshire University asserted that “academic standards and learning objectives were upheld” for the course.

“Staffordshire University endorses the responsible and ethical application of digital technologies in accordance with our guidelines. While AI tools may aid certain aspects of preparation, they cannot replace academic expertise and must always be utilized in a manner that preserves academic integrity and discipline standards.”

Although the university appointed a non-AI lecturer for the final lecture of the course, James and Owen indicated that it felt insufficient at this point, especially since the university seemingly continued to use AI in this year’s instructional materials.

“I feel as if a part of my life has been taken from me,” James stated.

Owen, who is in the midst of a career transition, explained that he opted for the course to gain foundational knowledge rather than merely a qualification, but he now believes it was a waste of time.

“It’s exceedingly frustrating to sit through material that lacks value when I could be dedicating my time to something genuinely worthwhile,” he remarked.

Source: www.theguardian.com

Prohibiting Phones in Schools Might Negatively Impact Certain Students’ Mental Well-Being

Some schools mandate that students store their cell phones in lockers throughout the school day.

Robin Utrecht/Shutterstock

Concerns about the adverse effects of excessive screen time are growing. Specifically, phone usage in educational settings can detract from learning experiences. However, some studies indicate that outright banning smartphones from schools can lead to feelings of loneliness among students, at least initially.

“When a school opts to completely prohibit smartphones, several factors should be considered,” explains Sanyogita Carré of Radboud University in the Netherlands. “Socially vulnerable youth may face additional challenges, leading to a possible sense of estrangement from peers.”

Mobile phones and similar devices have been linked to various issues, ranging from declining academic performance to worsening mental health among adolescents. However, there is a lack of concrete evidence supporting these claims. As of January 1, 2024, the Netherlands has instituted a ban on the use of smartphones and other smart devices in classrooms. Many schools enforce strict rules, prohibiting students from using such devices at all during class and often requiring that they be stored in lockers.

Seeking to unravel this impact, Carré and colleagues conducted a study with students from two middle schools in the Netherlands, both of which disallow smartphone usage during class. Surveys were administered initially in December 2023, prior to the nationwide ban, and repeated in March or April 2024.

The researchers aimed to analyze two types of loneliness: social loneliness and emotional loneliness. “Social loneliness pertains to your overall network and feelings of group belonging,” says Carré. “Emotional loneliness speaks to the depth of connection in close friendships.”

Upon comparing loneliness rates before and after the ban, the findings were mixed. “Though we did not observe a significant change in social loneliness, there was a slight uptick in emotional loneliness,” Carré says. This uptick may occur when close friends are absent from a student’s classes, limiting their ability to connect with them during the school day.

The research also revealed that while the general sense of social loneliness didn’t rise among children, those who found social interactions more challenging were likely to feel increased loneliness. Carré notes that these impacts might not be long-lasting as students adjust to the ban over time.

A limitation of the study is that there was no comparison with other Dutch schools where rules are more lenient, such as allowing phones during breaks. Jonathan Cantor of RAND, a US nonprofit research organization, says: “To draw meaningful comparisons, we need data from similar students in schools without a ban. Without that, we can’t ascertain whether the findings reflect broader patterns.”

Both Carré and Cantor say there is a gap in fundamental data on the effects of phone bans in schools. Cantor and his team have recently attempted to address this by looking into phone bans in US schools.

They discovered significant variability in policies: certain schools enforce total bans on phones, while others permit their use at the discretion of teachers. In the UK, government guidance recommends prohibiting phone use in schools but leaves the final decision on whether to ban the devices to school leaders.

“The pressing issue is the need for rigorous evaluations to assess the effectiveness of these policies at the school level. This requires detailed data on the types and timing of policies implemented,” Cantor says. “This is the focal point of current analysis efforts.”

Source: www.newscientist.com

Outrage from Parents: Meta Features Female Students’ Photos in Ads Aimed at Men

Meta has sparked controversy by utilizing a female student’s back-to-school photo to market one of its social media platforms to a 37-year-old man, a decision deemed “outrageous” and “upsetting.”

The man received a post urging him to “get Threads,” Mark Zuckerberg’s rival to Elon Musk’s X.

Meta used the children’s images after they were shared on Instagram by their parents at the start of the school year. The parents were unaware that Meta’s settings allowed this use. One mother noted her account was private, yet the posts automatically became visible in public threads. Another mother said she had posted the photo on her public Instagram account. In both cases, strangers were shown their children’s posts as “suggested threads.”

The recipient conveyed to the Guardian that the post felt “deliberately provocative and ultimately exploitative of the children and families involved.”

The father of the 13-year-old girl featured in the post said it was “absolutely outrageous.” All the images showed female students in short skirts, with bare legs or stockings.

“Discovering her image used by such a large company in a context that felt sexualized for product promotion made me feel quite sick,” he stated.

Meta, valued at about £1.5tn and based in Menlo Park, California, defended its actions, stating the images did not breach its policies. It said it encourages users to explore Threads by highlighting public posts that adhere to its community standards and recommendation guidelines. Although its systems do not recommend Threads posts shared by teenagers, these posts came from adult accounts set to public viewing.

The man who received the post noted that he was solely sent promotional content featuring female students, with no representation of boys in school uniforms, which he felt added a “sexualization aspect.”

One mother of a 15-year-old said the images were used in a promotional post featuring a prominent “thread” button. “It was a photo of my daughter heading to school. I never anticipated Instagram would utilize it for promotion. I absolutely hate it. She’s a minor,” she lamented.

She firmly rejected the idea that she would ever have agreed, saying she would not have consented “for all the money in the world” to Meta using the photo to promote its platform.

Her Instagram account, typically modest with 267 followers, saw her child’s post attract nearly 7,000 viewers.

Another mother whose 13-year-old child’s photo was similarly featured remarked:

Meta described such posts as recommended content, insisting that public posts are intended to be surfaced in this way.

A company spokesperson said: “The images shared do not violate our policies; they are back-to-school posts published by parents. We have systems in place to avoid recommending Threads posts shared by teenagers.”

The 37-year-old London Instagram user who received the post requested to remain anonymous.

He noted he had not liked or posted similar images prior to receiving photos of the schoolgirl.

“To me, this feels intentionally provocative, introducing trendy and popular content, ultimately exploiting the children and families involved and jeopardizing online safety.”

Beeban Kidron, a crossbench peer and campaigner for children’s rights online, said: “Meta prioritizes corporate growth over children’s right to privacy, which is the only reason we can fathom for sending a photograph of a female student to a 37-year-old man.”

She urged Ofcom, the communications regulator, to consider implementing measures this summer to prevent unidentified adults from contacting children, stating that “companies should not have the right to feed sensitive content to children.”

Ofcom’s illegal harms codes aim to combat online grooming, stipulating that “your child’s profile, location, and connections should not be visible to others.”

Meta’s settings state that if a Threads profile is public, posts from an adult profile can be recommended on Facebook or Instagram, allowing others to “discover, follow, and interact with you.” Users can turn this off or switch a Threads profile to private.

Source: www.theguardian.com

Thousands of UK University Students Caught Cheating Using AI

Thousands of university students in the UK have been caught misusing ChatGPT and similar AI tools in recent years, while traditional forms of plagiarism show a marked decline, a Guardian investigation has found.

The investigation into academic integrity violations found almost 7,000 proven cases of cheating involving AI tools in 2023-24, equivalent to 5.1 cases per 1,000 students, up from 1.6 cases per 1,000 students in the previous academic year, 2022-23.

Experts anticipate these figures will increase further this year, estimating potential cases could reach around 7.5 per 1,000 students, although reported cases likely reflect only a fraction of the actual instances.

This data underscores the rapidly changing landscape for universities as they strive to update evaluation methods in response to emerging technologies like ChatGPT and other AI-driven writing tools.

In the 2019-20 academic year, before the advent of generative AI, plagiarism accounted for nearly two-thirds of all academic misconduct. Plagiarism rates surged during the pandemic as many assessments moved online. With the rise of AI tools, however, the character of academic misconduct has changed.

Confirmed cases of traditional plagiarism fell from 19 per 1,000 students to 15.2 in 2023-24 and are projected to fall to approximately 8.5 per 1,000 students this academic year.

A set of charts showing confirmed misconduct cases per 1,000 students: plagiarism rises from 2019-20 to 2022-23 and then falls back, while AI-related misconduct rises from 2022-23 to a level comparable with plagiarism; “other misconduct” remains stable.

The Guardian contacted 155 universities under the Freedom of Information Act, requesting figures on proven cases of academic misconduct, plagiarism, and AI-related cheating over the past five years. Of these, 131 responded, though not all held data for every year or category of misconduct.

More than 27% of responding institutions did not categorize AI misuse as a distinct form of fraud in 2023-24, indicating a lack of acknowledgment of the issue within the sector.

Numerous instances of AI-related cheating may go undetected. A survey by the Higher Education Policy Institute found that 88% of students admitted to using AI in their assessments. And last year, researchers at the University of Reading tested their own marking systems and found that AI-generated submissions went undetected 94% of the time.

Dr. Peter Scarfe, an associate professor of psychology at the University of Reading and co-author of that research, said that while cheating has always existed, AI presents a fundamentally different challenge that the education sector must adapt to.

He remarked, “I believe the reality we see reflects merely the tip of the iceberg. AI detection operates differently from traditional plagiarism checks, making it almost impossible to prove misuse. If an AI detector indicates AI usage, it’s challenging to counter that claim.”

“We cannot merely transition all student assessments to in-person formats. Simultaneously, the sector must recognize that students are employing AI even if it goes unreported or unnoticed.”

Students keen to avoid AI detection have numerous online resources at their disposal. The Guardian found various TikTok videos that promote AI paraphrasing and essay writing tools tailored for students, which can circumvent typical university AI detection systems by effectively “humanizing” text produced by ChatGPT.

Dr. Thomas Lancaster, a researcher of academic integrity at Imperial College London, stated, “It’s exceedingly challenging to substantiate claims of AI misuse among students who are adept at manipulating the generated content.”

Harvey*, who has just completed a business management degree at a university in the north of England, told the Guardian that he used AI for brainstorming ideas and structuring assignments, as well as for suggesting references, noting that many of his peers have done the same.

“When I started university, ChatGPT was already available, making its presence constant in my experience,” he explained. “I don’t believe many students use AI simply to replicate text. Most see it as a tool for generating ideas and inspiration. Any content I derive from it, I thoroughly rework in my style.”

“I know people who, after using AI, enhance and adapt the output through various methods to make it sound human-authored.”

Amelia*, who has just completed her first year in a music business program at a university in the southwest, also acknowledged using AI for summarization and brainstorming, highlighting the tool’s significant benefits for students with learning difficulties. “A friend of mine uses AI for structuring essays rather than relying solely on it to write or study, integrating her own viewpoints and conducting some research. She has dyslexia.”

Science and Technology Secretary Peter Kyle recently emphasized to the Guardian the importance of leveraging AI to “level the playing field” for children with dyslexia.

Technology companies appear to see students as a key demographic for their AI products. Google is offering university students in the US and Canada a free 15-month upgrade to its Gemini tools.

Lancaster stated, “Assessment methods at the university level may feel meaningless to students, even if educators have valid reasons for their structure. Understanding the reasons behind specific tasks and engaging students in the assessment design process is crucial.”

“There are frequent discussions about the merits of increasing the number of examinations instead of written assessments, yet the value of retaining knowledge through memorization diminishes yearly. Emphasis should be on fostering communication skills and interpersonal abilities—elements that are not easily replicable by AI and crucial for success in the workplace.”

A government spokesperson stated that over £187 million has been invested in the national skills program, with guidelines issued on AI utilization within schools.

They affirmed: “Generative AI has immense potential to revolutionize education, presenting exciting prospects for growth during transitional periods. However, integrating AI into education, learning, and assessment necessitates careful consideration, and universities must determine how to harness its advantages while mitigating risks to prepare for future employment.”

*Names have been changed.

Source: www.theguardian.com

The social struggles of Australian college students: What are they missing out on at universities?

When Mai* began studying psychology in mid-2019, she looked forward to travelling to campus for lively conversations with classmates as they worked through new ideas.

However, when her in-person tutorials were exchanged for a Zoom meeting in 2020, her excitement turned into horror.

“People don’t switch their cameras on. They just have their names displayed,” says Mai. “It’s very lonely and very isolating. If you’re struggling with a question, no one will talk.”

Lecture theatres once full of students were abandoned in favor of pre-recorded lectures, Mai says. Even lab demonstrations were replaced by undirected Zoom breakout rooms.

Mai sat through online classes that fell silent midway through their hour-long time slots, as the instructor tried to coax a grid of faceless names into engaging with simple questions.

“No one spoke,” she says. “It’s so awkward, it’s so painful, you just want to leave.”

As soon as she graduated, Mai moved to Hobart to study medicine. Lockdowns were a fading memory and she was expecting a packed campus.

But apart from her medicine classmates, she says the campus was deserted – and it remains so two years later.

“I had this very naive vision: ‘Oh, wow, I’m going to meet a lot of students from many different places’ [but] many students just don’t attend because they have other work or life commitments,” she says.

Australian students like Mai enroll at university hoping for experiences that many institutions no longer offer. They imagine having the time and space to explore big ideas with their peers and teachers, to share vibrant discussions, and to find their path to becoming independent adults.

Those who can’t afford to spend the whole week on campus, or who aren’t given the option of in-person classes, worry that they are missing out on a high-quality education while being charged ever-increasing fees.

Students under financial pressure cut classes and pick up more paid work. The result, students say, is a vicious cycle of falling campus attendance: with fewer students taking part in classes in person, attending becomes even less attractive, and universities offer fewer face-to-face opportunities because students do not show up.

Like Mai, many people ask themselves: “What is the point of going to campus?”


According to Dr. Thuc Bao Huynh, a researcher at the Centre for Youth Policy and Education Practice at Monash University, Australians’ expectations of university life hark back to ideals from before the 1980s.

“If someone was a student, they wouldn’t actually do much else apart from being a student,” he says. “That’s not the case anymore.”

The myth of campus life is at odds with a modern reality in which fewer and fewer students have the luxury of making study and social life their primary responsibility. Since the 1990s, the number of Australians attending university from a wider range of backgrounds has grown, and rising living costs are accelerating the trend, Huynh says, forcing more students to treat university as a part-time commitment.

As rent and living expenses rise, the share of students holding down jobs is growing, according to some analyses. Almost half of all students chose to study part-time rather than full-time in 2023.

Those who can’t afford to spend the whole week on campus, or whose courses don’t offer in-person classes, worry that they are missing out on a quality education while being charged growing fees. Photo: Agenzia Sintesi/Alamy

“Being a student is now mashed in with everything else that young people are experiencing,” Huynh says. “It’s just another thing they have to deal with.”

Jed Brockhouse says classmates who struggle to juggle work and university won’t come to campus at La Trobe University in Melbourne’s north when they’re given the option to do coursework online.

“Why would you sit through an hour of traffic to sit in a class for two hours if you know you don’t need to be there?” he says.


Sam Lane only realized how much he had been missing when he took a break from his law classes and branched out into art history.

He says he went to university in 2019 looking for the picture of campus life his parents had painted for him.

Lane got a glimpse of that fantasy world at the University of New South Wales art school on Sydney’s bustling Oxford Street. His long, three-hour art history classes brought him and his peers together face to face and gave them time to grapple with and discuss aesthetics and philosophy.

“You didn’t feel like you were there just to be chewed up and spat out of the lecture theatre,” he says. “If the class finds something interesting, you can dig into it.”

However, once his art tutorials were over, Lane had to hurry back to UNSW’s main campus in Sydney’s east, where teaching time on campus had been cut and professors rushed through an impossible amount of information.

Sam Lane says he went to university in 2019 looking for the picture of campus life painted by his parents. Photo: Lisa Marie Williams/The Guardian

“You don’t have time to chat, you don’t have time to get to know the people around you,” he says. “You’re too busy trying to get through all the content very quickly.”

Lane is now approaching the end of his degree and has watched attendance at student association events dwindle, putting an end to long-running party traditions, including battle of the bands events.

“People want to recognise what they’ve lost and really bring it back … [but] there’s no good time to put it [on]. There aren’t enough people on campus, so they don’t get the turnout,” he says. “It’s a little dead.”


The shift towards online learning reflects wider pressures. Declining federal funding and the threatened loss of income from international students, a major source of revenue, have forced universities to make savings. At the same time, university staff are teaching 200,000 more students than they did a decade ago.

Kaab Qureshi, a second-year student at the Australian National University in Canberra, says it’s difficult to learn in classes that have become “strange” as the university makes cuts and condenses contact time.

“They just want to cut costs as quickly as possible,” he says. “I think they’ve put reputation and profit ahead of student engagement and support.”

Even packed face-to-face classes haven’t stopped those who can afford to spend time on campus from finding the community they wanted.

Kristy Sauw, a classmate of Qureshi’s at ANU, says her first year at university wasn’t bad. After moving from high school in Wagga Wagga into an on-campus residential hall, she found it easy to make friends and attend classes in person.

“We made a lot of friends in our philosophy tutorials because we treated it as an hour to yap. It was really fun,” she says. “As much as we focused on what we were actually meant to be talking about, we also bonded and chatted about random topics.”

Qureshi spent extra money to live in the residential hall for his first year, but the

Source: www.theguardian.com

Survey reveals significant decline in the participation of female students in computing GCSEs in England

The number of girls studying computing GCSEs in England has more than halved in less than a decade, leading to warnings about “male dominance in shaping the modern world”.

The sharp fall in female participation comes as government changes to qualifications see the old Information and Communications Technology (ICT) GCSE abolished and replaced with a new Computer Science GCSE.

Government reforms aimed to create “more academically challenging and knowledge-based” qualifications, but the introduction of the new curriculum had the unintended consequence of reducing female enrolments, new research from King’s College London has found.

In 2015, 43% of ICT GCSE candidates were women, but in 2023, just 21% of those taking GCSE Computer Science were women.

To put the figures in perspective, 40,000 girls took ICT GCSEs and a further 5,000 took Computer Science in 2015. By 2023, with ICT no longer available, just 18,600 girls took Computer Science.

When asked why, girls who chose not to study computer science said they didn’t enjoy it and that it didn’t fit into their career plans, the survey found.

Critics of the old ICT qualification complained that it only taught students how to use Microsoft Office. In contrast, the new Computer Science GCSE, with its emphasis on computer theory, coding and programming, is perceived by many students as “harder” than other subjects.

The study recognised that computer science GCSEs are here to stay, with 88,000 students taking the subject in 2023, and a four-fold increase in the number of A-level candidates between 2013 and 2023.

“However, these successes coincide with a general decline in computer and digital skills education at secondary school level, particularly affecting girls, certain ethnic groups and students from disadvantaged socio-economic backgrounds,” the report said.

The report included a series of recommendations calling for urgent curriculum reform, more support for computing teachers and “expanding the current narrative about computing to focus on more than just male tech entrepreneurs.”

“The lack of women in the computing industry could lead to increased vulnerability and male dominance in shaping the modern world,” the authors warned.

“There is an urgent need for action to encourage more girls to study computing at school so they can gain the digital skills they need to participate in and shape the world,” said Dr Peter Kemp, lead researcher on the study and senior lecturer in computing education at King’s College London.

“Current GCSEs focus on developing computer science and programming skills and this appears to be preventing young people, particularly girls, from taking up the subject. We need to ensure that computing is attractive to all pupils and meets the needs of young people and society.”

“All students should leave with the digital skills they need to succeed in the workplace and society,” says Pete Dolling, head of computing at Fulford School in York. “The curriculum needs to be reformed to include a comprehensive computing GCSE that provides essential skills and knowledge, going beyond just computer science.”

Maggie Philbin, the technology broadcaster and a director of TeenTech, which promotes digital skills, added: “At the moment many students consider the subject to be ‘difficult’ and will vote with their feet if they want to achieve the best results. It’s time to look at this subject with a fresh eye and work with teachers to design a curriculum that is more engaging and that teachers can be confident delivering.”

Source: www.theguardian.com

Students Implicated in Cyber Fraud After Police Discover Involvement in Massive Phishing Site

Police have uncovered a disturbing trend among university students, who are resorting to cyber fraud to boost their income. They have managed to infiltrate a large phishing site on the dark web that has defrauded tens of thousands of individuals.

The site, known as LabHost, had been operational since 2021 and served as a hub for cyber fraud, enabling users to create realistic-looking websites mimicking reputable companies such as major banks. It ensnared tens of thousands of victims globally, including around 70,000 individuals in the UK.

Victims unknowingly provided sensitive information, which was then used to siphon money from their accounts. The perpetrators behind the site profited by selling this stolen data on the dark web to other fraudsters.

According to the Metropolitan Police, the primary victims fall within the 25-44 age bracket and conduct much of their lives online.

Law enforcement authorities have apprehended one of the alleged masterminds behind the site, along with 36 other suspects detained in the UK and abroad. Arrests were made at Manchester and Luton airports, as well as in Essex and London.

British police are facing mounting pressure to demonstrate their effectiveness in combating the rising tide of cyber fraud.

Although dismantling this one site will make only a modest dent in overall fraud, police intend to take down further cyber fraud operations to undermine the confidence of criminals who believe they can act with impunity.

While fraud and cybercrime present considerable challenges for law enforcement agencies, they often compete for resources with other policing priorities, such as safeguarding children and enhancing women’s safety.

LabHost managed to amass significant amounts of sensitive data, including 480,000 debit or credit card numbers and 64,000 PIN numbers, generating over £1 million in membership fees from 2,000 individuals who paid in cryptocurrency.

The site lured users with tutorial videos showing how to commit crimes using its services, presented in the style of a launch for a new consumer product. It promised that phishing pages could be set up in five minutes and offered “customer service” in case of any issues.

DI Oliver Richter noted the shift in cyber fraud from requiring technical skills like coding to now being accessible to individuals ranging from late teens to late 20s, many of whom are college students.

He expressed concern that these users may not fully grasp the risks and consequences of their actions, assuming anonymity and ease of operation.

Following the dismantling of the site, 800 users received warnings that the police were aware of their activities.

Detective Inspector Helen Rance, head of the Metropolitan Police’s cybercrime unit, described the LabHost operation as a sophisticated effort targeting those who have commercialized fraud. She highlighted collaboration with 17 partner organizations around the world, in both the public and private sectors.

She emphasized the success of penetrating the service, identifying the perpetrators, and understanding the scale of their illicit operations.

Source: www.theguardian.com

Cutting-Edge UK University Amazes Students with Hologram Lecturer Technology

Any university lecturer will tell you that getting students to come to their morning lectures is a real struggle.

But even the most hungover fresher is sure to be captivated by a physics lesson from Albert Einstein or a design masterclass from Coco Chanel.

This could soon be the case for students in the UK, as some universities begin beaming in guest lecturers from around the world using the same holographic technology that has been used to bring deceased singers back to the stage.

Loughborough University, said to be the first in Europe to adopt the technology, has used it to beam in sports scientists from the Massachusetts Institute of Technology (MIT), and plans to test it for teaching fashion students how to create immersive shows and management students how to handle difficult business situations.

Professor Vicky Locke, dean of Loughborough Business School, who is leading the rollout of the technology, said students “absolutely love” it and want to take selfies with it. They would prefer “a guest speaker from the industry who walks into the classroom with a smile on their face rather than a two-dimensional person on the wall,” she added.

Zoom calls made students “feel like they were watching TV… it felt distant,” she said, whereas holographic images are more appealing and realistic. The technology will be officially introduced into the curriculum in 2025 after a year of experimentation.

The box-based holographic units are sold by the Los Angeles company Proto, whose customers include BT and IBM, which use them in meetings to reduce the need for business travel. Proto has also collaborated with the Stockholm-based fashion retailer H&M to create interactive product displays.

David Nussbaum, who founded Proto four years ago after working on holograms of deceased celebrities, said his company could soon bring some of the 20th century’s greatest thinkers back from the dead.

He added: “Proto has the technology to project an image of Stephen Hawking or others so it seems as if he’s really there. You can connect it to his books, lectures, social media and so on, to a question you asked, an interaction you had with him. An AI Stephen Hawking looks just like him, sounds like him, and interacts as if it were him.”

“It’s awe-inspiring and mind-blowing. I’m shocked at how great the interactions are. Whether people like it or not, AI is part of our lives.”

He added that his company’s ambition is to prove that “you don’t have to be an eccentric billionaire or celebrity to have a hologram.”

Gary Barnett, Professor of Digital Creativity at Loughborough University, who is also leading the implementation, said:

“Students need to understand what it means to use them, to be in that world, to experience them, to interact with them, and all that they will need for their future careers.”

Professor Rachel Thomson, the university’s provost and deputy vice-chancellor, said the technology could reduce the need to bring in guest speakers at short notice, encourage international research collaboration, and cut the amount of materials students use when building prototypes in engineering, design and the creative arts, helping the university meet its sustainability strategy.

It also allows instructors to display complex equipment, such as engines, more easily than over a video call.

Nussbaum said corporations and large institutions such as universities are the first step in his company’s plans, but he hopes to roll out mini-units costing less than $1,000 within the next 18 months, which will display a miniature image that he likens to the “Wonkavision” television in Roald Dahl’s Charlie and the Chocolate Factory.

He added that the technology’s AI capabilities meant it was possible to create an avatar that looked like anyone in the world, but noted that this could come with legal complications.

Source: www.theguardian.com