End-to-End Encryption: The Ultimate Security Solution of the Century

Everyone has secrets to protect. In today’s digital age, whether safeguarding personal messages, business communications, or confidential state information, end-to-end encryption (E2EE) offers essential security and peace of mind.

E2EE ensures that your communications remain private from internet service providers and the operators of messaging or video conferencing applications. Messages are encrypted on the sender’s device and only decrypted by the recipient, making them unreadable to unauthorized parties while in transit. This prevents access by any entity, including law enforcement or corporate insiders.
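The flow described above can be sketched in a few lines: the two endpoints share a key, the relay in the middle only ever sees ciphertext. This is a minimal illustration using a hash-based XOR stream cipher, not the vetted protocols (such as the Signal protocol) that real E2EE apps use.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Expand a shared key into n pseudorandom bytes (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, message: bytes) -> bytes:
    return bytes(m ^ k for m, k in zip(message, keystream(key, len(message))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

shared_key = b"known only to the two endpoints"
ciphertext = encrypt(shared_key, b"meet at noon")  # leaves the sender's device
assert ciphertext != b"meet at noon"               # the relay sees only this
assert decrypt(shared_key, ciphertext) == b"meet at noon"  # endpoint decrypts
```

The server forwarding `ciphertext` cannot read it without `shared_key`, which never leaves the two devices.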

Digital encryption is rooted in robust mathematics rather than mere assurances. The RSA algorithm, introduced in 1977, pioneered modern encryption by relying on the complexity of factoring large numbers into their prime components. Since then, various algorithms have emerged, utilizing intricate mathematics to enhance cryptographic security.
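The factoring idea can be seen in textbook RSA with deliberately tiny primes (the classic teaching example; real keys use primes of roughly 1024 bits each):

```python
# Textbook RSA: security rests on n being hard to factor back into p and q.
p, q = 61, 53
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120 -- computable only if you can factor n
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: 2753, since 17 * 2753 = 1 (mod 3120)

m = 65                       # the message, encoded as a number smaller than n
c = pow(m, e, n)             # encrypt with the public key: 2790
assert pow(c, d, n) == m     # decrypting with the private key recovers m
```

Anyone can encrypt with the public pair (n, e), but computing d requires phi, and phi requires the prime factors of n.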

The true strength of E2EE lies not just in its technical implementation, but in how it upholds democracy and human rights across the globe. As Matthew Feeney of the UK privacy group Big Brother Watch puts it, “There are individuals in perilous regions depending on encryption to preserve their lives.” Even in established democracies, freedom is vulnerable: Feeney warns that those who claim “I have nothing to hide” should heed history’s lessons.

Many governments view E2EE unfavorably because it blocks surveillance, much as sealed post once shielded letters from prying eyes. UK governments have repeatedly attempted to restrict E2EE; most recently, Prime Minister Keir Starmer’s government reversed a controversial demand for a backdoor into Apple’s systems following a public outcry.

Feeney acknowledges the uncertainty surrounding the potential for E2EE to be compromised, as intelligence agencies typically do not disclose their capabilities. Concerns loom regarding the advent of quantum computing, which may soon breach current encryption algorithms. However, cryptography continues to evolve, with emerging mathematical solutions challenging outdated algorithms. “Governments may wield power, but they can’t override the laws of mathematics,” Feeney asserts.


Source: www.newscientist.com

Where Did You Store It? Losing the Crucial Encryption Key Could Nullify the Election.

Feedback A weekly column filled with strange stories, unbelievable marketing claims, and perplexing instructions.

Locked Out

Feedback believes the expression “you couldn’t make it up” is frequently misinterpreted. It doesn’t imply a limit to creativity; rather, it signifies that some developments could never fit into a fictional narrative, because readers would insist, “That would never happen.” The trouble is, those readers are mistaken: reality can be quite absurd.

One of the key organizations in the field is the International Association for Cryptologic Research (IACR), a non-profit committed to advancing cryptologic science. The IACR recently held elections for new officers and board directors, along with a vote on amendments to its bylaws. Fittingly for an organization of cryptographers, it ran the vote on an online voting platform called Helios, which promises “verifiable online elections.”

Helios is a clever design. Every vote is recorded, so you can check that your vote was counted and remains unchanged, making tampering detectable. At the same time, every vote is kept confidential: the system uses homomorphic encryption, combining all the encrypted votes into a single encrypted tally, so that only the tally ever needs to be decrypted.
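The add-votes-without-decrypting-them trick can be sketched with a toy Paillier cryptosystem. (Helios itself uses exponential ElGamal; Paillier is shown here because it demonstrates the same additive property more directly.) Multiplying ciphertexts adds the underlying votes, so no individual ballot is ever decrypted:

```python
import math, random

# Toy additively homomorphic tally with tiny primes (illustration only).
p, q = 47, 59
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)   # precomputed decryption constant

def enc(m: int) -> int:
    r = random.randrange(1, n)                # fresh blinding factor per vote
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

votes = [1, 0, 1, 1, 0]                       # 1 = yes, 0 = no
tally = 1
for v in votes:
    tally = (tally * enc(v)) % n2             # aggregate without decrypting
assert dec(tally) == sum(votes)               # only the total (3) is revealed
```

The blinding factor `r` means two identical votes produce different ciphertexts, so the public record leaks nothing about individual ballots.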

But how is that tally decrypted? Someone has to hold the decryption key. The IACR appointed three trustees, each holding one-third of the key material; all three had to enter their portions to decrypt the tally and reveal the results. It was all or nothing: partial decryption wasn’t an option.
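The fragility of that all-of-three arrangement is easy to demonstrate. Helios trustees actually hold ElGamal key shares, but an XOR-based analogue shows the same failure mode: the key is the XOR of three shares, so losing any single share destroys it beyond recovery.

```python
import secrets

# All-of-three splitting: key = share1 XOR share2 XOR share3.
key = secrets.token_bytes(16)
share1 = secrets.token_bytes(16)
share2 = secrets.token_bytes(16)
share3 = bytes(k ^ a ^ b for k, a, b in zip(key, share1, share2))

# With all three shares present, the XORs cancel and the key comes back:
recovered = bytes(a ^ b ^ c for a, b, c in zip(share1, share2, share3))
assert recovered == key
# With only share1 and share2, every possible 16-byte key is equally
# plausible -- there is no partial recovery.
```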

Then the inevitable occurred. “Unfortunately, one of the three trustees lost their private key irretrievably, an honest but unfortunate mistake, and as a result we are unable to compute the decryption share,” the IACR wrote on November 21st. “Consequently, Helios cannot complete the decryption process, making it technically impossible to obtain or verify the election’s final outcomes.”

The IACR was compelled to void the elections and restart the entire process. It now plans to “implement a two-out-of-three threshold mechanism for managing private keys and distribute clear written procedures for all trustees to follow before and during elections.” Feedback is eager to scrutinize those written procedures, even if their sole instruction is “Remember” in large bold letters on the cover page.
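A two-out-of-three threshold can be built with Shamir secret sharing, sketched here under simplified assumptions (a fixed rather than random coefficient, and the key encoded as one integer): the secret is the constant term of a degree-1 polynomial over a prime field, and any two of the three shares interpolate it back.

```python
# 2-of-3 Shamir secret sharing: one lost share no longer voids the election.
P = 2**127 - 1                         # field modulus (a Mersenne prime)
secret = 123456789                     # the decryption key, as a number < P
coeff = 42                             # random in practice; fixed for clarity

def f(x: int) -> int:
    return (secret + coeff * x) % P    # degree-1 polynomial with f(0) = secret

shares = {x: f(x) for x in (1, 2, 3)}  # one (x, f(x)) share per trustee

def recover(x1: int, y1: int, x2: int, y2: int) -> int:
    """Lagrange interpolation at 0 from any two shares."""
    l1 = (-x2) * pow(x1 - x2, -1, P) % P
    l2 = (-x1) * pow(x2 - x1, -1, P) % P
    return (y1 * l1 + y2 * l2) % P

# Trustee 3's key is lost -- trustees 1 and 2 can still decrypt:
assert recover(1, shares[1], 2, shares[2]) == secret
```

A single share, by contrast, is consistent with every possible secret, so confidentiality is preserved.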

We are also struck by how what the IACR calls “human error” can undermine even the most meticulously designed systems. Every time a Silicon Valley entrepreneur proclaims the imminent arrival of human-level artificial intelligence, we inwardly wince: the first such AI will presumably make human-level mistakes.

Floating Like a Raisin

The creativity of science journalists in coming up with new and engaging units of measure is always impressive. On November 17th, The New York Times featured a story about “a small radio-frequency tag powered by solar energy that weighs just 60 milligrams and retails for $200.” Entomologists are using the tag to track monarch butterflies on their migration across North America.

Anthony Weaver highlighted a sentence aimed at illustrating how heavy the tag is compared to the butterfly. “Most monarchs weigh between 500 and 600 milligrams, meaning each tagged transcontinental traveler is like half a raisin carrying three grains of uncooked rice.”

Here at Feedback, we can all agree that this is far clearer than “about one-tenth of body weight” could ever be. Or as Anthony phrased it, “When I envisioned myself as a half-raisin on a cross-continental trek to Mexico with rice, I genuinely grasped how butterflies relate to science.”

No, this is not an invitation to submit similar examples from the pages of New Scientist. Don’t even think about it.

Boys Club

You may not see much of Feedback on social media, simply because we lack the mental bandwidth to figure out how to stand out on six different platforms with entirely different algorithms. Nonetheless, we half-watch these things, which is why a spontaneous experiment on LinkedIn caught our eye: women in the field altered their names and pronouns to appear male, and saw a spike in engagement.

For instance, social media consultant Simone Bonnet observed a 1600 percent rise in profile views after changing her pronouns to “he/him” and her name to “Simon E.” According to The Guardian, others experienced similar increases. Meanwhile, Daniel Hyers, who possesses an ideal LinkedIn name, attempted the reverse, changing his name to Daniela for four days and seeing a 26 percent drop in engagement on day one.

Sakshi Jain from LinkedIn said that the platform’s algorithms and AI systems do not use demographic data (age, race, gender, etc.) as signals when determining the visibility of content, profiles, and posts in users’ feeds. We don’t dispute that, but we also suspect that unintended, emergent algorithmic bias plays a significant role.

Meanwhile, Feedback is in the midst of creating a new LinkedIn page, which we will dub Mansplaining.

Got a story for Feedback?

You can email your stories to Feedback at feedback@newscientist.com. Please include your home address. This week’s and past Feedbacks are accessible on our website.

Source: www.newscientist.com

Former UK Cyber Chief believes it is “unrealistic” to demand that Apple break encryption

Apple has withdrawn one of its encryption services from its UK customers


A former UK cybersecurity chief has called the UK government “naive” over its request for Apple to add a backdoor to its software, which would allow UK intelligence agencies to search customer data.

Ciaran Martin was head of cybersecurity at the UK’s Government Communications Headquarters (GCHQ) and the first CEO of the National Cyber Security Centre (NCSC) before joining the University of Oxford in 2020. He spoke to New Scientist following reports that the UK government has made an unprecedented demand that Apple grant access to data stored by its users anywhere in the world, even if it is encrypted.

Such an order, made under the Investigatory Powers Act 2016, is intended to remain secret, but Martin says it’s not surprising that details appear to have leaked. “I think the idea that this type of order for companies like Apple would work secretly was probably naive,” he says.

Neither the Home Office nor Apple has confirmed the existence of the request. However, in February, Apple announced that it would no longer offer its Advanced Data Protection service, which is designed to securely encrypt cloud data, to new users in the UK. “As we have said many times before, we have never built a backdoor or master key to any of our products or services and we never will,” Apple said at the time. The company is also reportedly challenging the UK order in a legal case that is likely to be heard in secret.

Martin says that while it’s not uncommon for governments and the technology industry to clash over security issues, such disputes are usually “susceptible to some form of compromise.” Several times during his career in intelligence, he says, agencies asked technology companies to remove features that malicious actors were using to harm national security or to run criminal enterprises. He declined to give details, but said these were often small, specialised technology providers.

“They’ll have a new app or something, and it will become a criminal favourite for certain features, and you just say, ‘Look, you can’t do this,’” says Martin. “They are little niche technologies, they aren’t widely used. They are more misused than they are used.”

At the end of the day, he says, governments must accept that uncrackable encryption is here to stay. “That ship has sailed,” says Martin. “I think the government has to accept this in the end, and I think, in the long run, trying to strong-arm a global titan on the [US] west coast is not going to go well.”


Source: www.newscientist.com

Signal’s Meredith Whittaker discusses how encryption poses a significant challenge to authority

Meredith Whittaker practices what she preaches: As president of the Signal Foundation, she’s a vocal advocate for privacy for all. But she doesn’t just spout empty words.

In 2018, she came to the public’s attention as one of the organizers of the Google walkout, mobilizing 20,000 employees at the search giant in a dual protest against state-sponsored surveillance and sexual misconduct.

Whittaker remains passionate about privacy after five years in the public eye, during which she has testified before Congress, taught as a university professor, and advised federal agencies.

It’s not uncommon, for example, for business leaders to politely decline when asked about their salary for the CVs that accompany these interviews. A flat-out refusal to answer questions about age or family is less common. “As a privacy advocate, Whittaker won’t answer personal questions that could be used to guess passwords or the ‘secret answers’ for bank authentication,” a staffer told me after the interview. “And she encourages others to do the same!”

When she left Google, Whittaker issued a memo to the company announcing her commitment to the ethical adoption of artificial intelligence and to organizing for a “responsible tech industry.” “It’s clear to me that Google is not the place for me to continue doing this work,” she wrote. That clarity and refusal to compromise sent a signal.

The Signal Foundation was founded in 2017 with $50 million in funding from WhatsApp co-founder Brian Acton, and its mission is to “protect freedom of expression and enable secure global communications through open source privacy technology.”

The foundation took over development of the messaging app Signal in 2018, and Whittaker took on the newly created role of president in 2022. The timing was just right to start defending Signal, and encryption in general, against a wave of attacks from nation states and corporations around the world.

Laws such as the UK’s Online Safety Act (OSA) and the EU’s child sexual abuse regulation contain language that could be used to block or decrypt private communications, while Meta’s proposal to introduce end-to-end encryption on Facebook and Instagram drew a strong backlash from politicians such as Priti Patel, who, as UK home secretary, called the plans “devastating”.

Whittaker told the Observer that these attacks are not new: “Going back to 1976, [Whitfield] Diffie and [Martin] Hellman were about to publish a paper introducing public key cryptography, a technology that would allow encrypted communication over the internet, and intelligence agencies were trying to stop them.

“Throughout the ’80s, the NSA [US National Security Agency] and GCHQ lost their monopoly on encryption, and by the ’90s it was all governed under munitions treaties. This was the ‘crypto wars’. You couldn’t mail code to someone in Europe; it was considered a munitions export.”
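The 1976 Diffie-Hellman scheme Whittaker refers to can be run end to end with toy numbers (real deployments use moduli of roughly 2048 bits). Each party combines its private exponent with the other’s public value, and both arrive at the same shared secret without ever transmitting it:

```python
# Toy Diffie-Hellman key exchange with tiny public parameters.
p, g = 23, 5                  # public: prime modulus and generator

a, b = 6, 15                  # private exponents (chosen at random in practice)
A = pow(g, a, p)              # Alice publishes g^a mod p = 8
B = pow(g, b, p)              # Bob publishes g^b mod p = 19

shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob = pow(A, b, p)     # (g^a)^b mod p
assert shared_alice == shared_bob == 2   # same secret on both sides
```

An eavesdropper sees only p, g, A, and B; recovering the secret from those requires solving the discrete logarithm problem, which is believed to be intractable at real key sizes.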

But the larger push towards commercializing the internet forced a degree of softening: “It allowed transactions to be encrypted, allowing big companies to choose exactly what to encrypt. At the same time, the Clinton administration endorsed surveillance advertising as a business model, creating incentives to collect data on customers in order to sell it to them.”

Surveillance, she says, has been a “disease” since the dawn of the internet, and encryption poses “a serious threat to the type of power that shapes itself through these information asymmetries.” In other words, she doesn’t see the fight ending anytime soon: “I don’t think these arguments are honest. There are deeper tensions here, because in the 20 years since this metastatic tech industry developed, we’ve seen every aspect of our lives subject to mass surveillance by a small number of companies that, in partnership with the US government and other ‘Five Eyes’ agencies, collect more surveillance data than any organization in the history of humanity has ever had.”

“So if we continue to defend these little pockets of privacy and don’t eventually expand them, and we have to fight back a little bit to get a little bit more space, I think we’re going to have a much darker future than if we defended our position and were able to expand the space for privacy and free communication.”

Criticisms of encrypted communications are as old as the technology itself: allowing everyone to talk without nation states being able to eavesdrop on the conversation is a godsend for criminals, terrorists, and pedophiles around the world.

But Whittaker argues that some of Signal’s strongest critics seem inconsistent about what they care about: “If they are really interested in helping children, why are Britain’s schools collapsing? Why have social services been funded with just 7% of the amount proposed to fully fund agencies on the front line of preventing abuse?”

Sometimes the criticism is unexpected. Signal was recently drawn into the US culture wars after a right-wing campaign to unseat National Public Radio’s new CEO, Katherine Maher, expanded to include Signal, where Maher serves as a board director. Elon Musk, who had once promoted the app, joined in, claiming Signal was “potentially compromised” and had “known vulnerabilities.”

Whittaker said the allegations are “a weapon in the propaganda war to spread disinformation. We are seeing similar disinformation related to the escalation of the conflict in Ukraine that appears to be designed to move people away from Signal. We believe these campaigns are designed to direct people to less secure alternatives that are more susceptible to hacking and interception.”

The same technology that has drawn criticism for the foundation is also popular among governments and militaries around the world who need to protect their communications from the prying eyes of nation-state hackers and others.

Whittaker sees this as a leveller: Signal is for everyone.

“Signal is either for everyone or it’s for no one. Every military in the world uses Signal, every politician I know uses Signal, every CEO I know uses Signal, because anybody who has to do really sensitive communication knows that storing it in plaintext in a Meta database or on a Google server is not a good practice.”

Whittaker’s vision is singular, and she is not one to be distracted: despite her interest in AI, she is cautious about combining it with Signal, and has been critical of apps like Meta’s WhatsApp that have introduced AI-enabled features.

“I’m really proud that we don’t have an AI strategy. We have to look at ourselves and say, where is the data coming from to train our models, where is the input data coming from? How do we have an AI strategy when our focus is on protecting privacy, not surveilling people?”

Whatever the future holds in terms of technology and political attitudes towards privacy, Whittaker is adamant that the principle is an existential issue.

“We will do the right thing. We would rather go out of business than undermine or backdoor the privacy guarantees that we promise people.”

CV

Age No comment.
Family No comment.
Education I studied literature and rhetoric at Berkeley, then joined Google in 2006 and got the rest of my education there.
Pay No comment.

Source: www.theguardian.com