At its Worldwide Developers Conference on Monday, Apple unveiled for the first time its vision for equipping its product line with artificial intelligence. The key feature, which will run across virtually the entire product range, is Apple Intelligence, a suite of AI-powered capabilities that promises to deliver personalized AI services while keeping sensitive data safe.
This represents Apple's biggest step yet in using our personal data to help AI perform tasks for us. To prove it can do this without sacrificing privacy, the company says it has built a new way to handle sensitive data in the cloud.
Apple says its privacy-focused system will first attempt to perform AI tasks locally, on the device itself. If any data must be exchanged with cloud services, it will be encrypted and then deleted afterward. The company also claims that this process, which it calls Private Cloud Compute, will be open to review by independent security researchers.
The proposal draws an implicit contrast with companies such as Alphabet, Amazon, and Meta, which collect and store enormous amounts of personal data. Apple says any personal data sent to the cloud will be used solely to complete the AI task at hand and will not be retained or made accessible to the company, even for debugging or quality control, after the model completes the request.
Simply put, Apple says people can trust it to analyze incredibly sensitive data (photos, messages, and emails containing intimate details of our lives) and deliver automated services based on what it finds there, without actually storing that data online or making it vulnerable.
Apple showed several examples of how this will work in upcoming versions of iOS. For example, instead of scrolling through Messages for a podcast your friend sent you, you can ask Siri to find it and play it for you. Craig Federighi, Apple's senior vice president of software engineering, described another scenario: you receive an email saying a work meeting has been pushed back, but your daughter is performing in a play that evening. Your phone can now find a PDF with information about the performance, predict local traffic, and tell you whether you will make it on time. These capabilities will extend beyond Apple's own apps, letting third-party developers tap into Apple's AI as well.

Because Apple makes its money from hardware and services rather than advertising, it has fewer incentives than some of its competitors to collect personal data, allowing it to position the iPhone as the most private device available. Even so, Apple has frequently found itself in the crosshairs of privacy advocates. Security flaws led to explicit photos leaking from iCloud in 2014. In 2019, contractors were found to be listening to intimate Siri recordings for quality control. And disputes continue over how Apple handles data requests from law enforcement.
According to Apple, the first line of defense against privacy violations is to avoid cloud computing for AI tasks whenever possible. "The cornerstone of the personal intelligence system is on-device processing," Federighi says, meaning many of the AI models will run on iPhones and Macs rather than in the cloud. "It's aware of your personal data without collecting your personal data."
That presents some technical hurdles. Two years into the AI boom, querying models for even simple tasks still requires enormous amounts of computing power. Achieving that with the chips used in phones and laptops is difficult, which is why only the smallest of Google's AI models can run on the company's phones, with everything else handled via the cloud. Apple says its ability to perform AI computations on-device rests on years of research into chip design that led to the M1 chips, which began rolling out in 2020.
However, even Apple's most advanced chips can't handle the full range of tasks the company promises to accomplish with AI. If you ask Siri to do something complex, it may need to pass that request, along with your data, to models that are available only on Apple's servers. This step, security experts say, introduces a host of vulnerabilities that could expose your information to outside attackers, or at least to Apple itself.
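The device-first policy Apple describes can be pictured as a simple routing decision: handle the request locally if the on-device model can, and fall back to server-side models only when it can't. The sketch below is purely illustrative; all names, thresholds, and the complexity heuristic are invented assumptions, not Apple's actual API or logic.

```python
# Illustrative sketch of an "on-device first" routing policy.
# Every identifier and threshold here is a hypothetical assumption.

from dataclasses import dataclass


@dataclass
class Request:
    prompt: str
    estimated_complexity: int  # rough proxy for required compute


# Assumed ceiling of what the local model can handle.
ON_DEVICE_LIMIT = 100


def route(request: Request) -> str:
    """Prefer local processing; fall back to server models only when
    the task exceeds the on-device model's assumed capability."""
    if request.estimated_complexity <= ON_DEVICE_LIMIT:
        return "on-device"
    # Per Apple's claims, data sent here is encrypted in transit and
    # deleted once the task completes.
    return "private-cloud-compute"


print(route(Request("find the podcast my friend sent", 40)))
print(route(Request("cross-reference my email, a PDF, and traffic", 400)))
```

Running this prints `on-device` for the simple request and `private-cloud-compute` for the complex one, mirroring the fallback behavior described above.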
"I always warn people that as soon as your data leaves your device, it becomes much more vulnerable," says Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project and practitioner in residence at the NYU School of Law's Information Law Institute.
Apple says it has mitigated this risk with its new Private Cloud Compute system. "For the first time ever, Private Cloud Compute extends the industry-leading security and privacy of Apple devices into the cloud," Apple's security experts wrote in their announcement, declaring that personal data "is not accessible to anyone other than the user, not even to Apple." How does it work?
Apple has historically encouraged people to opt in to end-to-end encryption (the same type of technology used in messaging apps like Signal) to protect sensitive iCloud data. But that doesn't work for AI. Unlike messaging apps, where a company like WhatsApp doesn't need to see the content of your messages in order to deliver them to your friends, Apple's AI models need unencrypted access to the underlying data to generate responses. This is where Apple's privacy process comes in. First, Apple says, the data will be used only for the task at hand. Second, the process will be verifiable by independent researchers.
Needless to say, the architecture of this system is complex, but you can think of it as an encryption protocol. If your phone decides it needs the help of a larger AI model, it packages up a request containing the prompt and the specific model to be used, and then puts a lock on that request. Only the specific AI model being used holds the corresponding key.
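The "lock and key" idea in that paragraph can be sketched in a few lines. This is a deliberately toy model, not real cryptography and not Apple's protocol: it stands in for the property that a request is opaque in transit and readable only by the one model that holds the matching key. All names are invented, and a real system would use an authenticated public-key scheme rather than XOR.

```python
# Conceptual toy (NOT real cryptography) of the described request flow:
# the phone locks a request so that only the target model's key opens it.

import secrets


def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR; applying the same key twice round-trips.
    return bytes(b ^ k for b, k in zip(data, key))


# Each server-side model holds its own secret key (hypothetical names).
MODEL_KEYS = {"summarizer-large": secrets.token_bytes(256)}


def package_request(prompt: str, model: str) -> bytes:
    """Lock the request so only `model` can read it.
    In a real design this would be the model's public key."""
    return xor_bytes(prompt.encode(), MODEL_KEYS[model])


def model_decrypt(blob: bytes, model: str) -> str:
    """Only the holder of the model's key can recover the prompt."""
    return xor_bytes(blob, MODEL_KEYS[model]).decode()


blob = package_request("summarize my inbox", "summarizer-large")
assert blob != b"summarize my inbox"  # opaque while in transit
print(model_decrypt(blob, "summarizer-large"))
```

The point of the design, as Apple describes it, is that no other party on the path, including Apple's own infrastructure outside that model, holds a key that opens the request.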
When MIT Technology Review asked whether users will be notified when a particular request is sent to cloud-based AI models instead of being handled on-device, an Apple spokesperson said there would be transparency for users but offered no further details.
Dawn Song, co-director of the Center on Responsible Decentralized Intelligence at the University of California, Berkeley, and an expert in private computing, says Apple's announcement is encouraging. "The list of goals they've stated is well thought out," she says. "Of course, there will be some challenges in meeting those goals."
Cahn says that, judging from what Apple has revealed so far, the system looks considerably more secure than other AI products out there today. Still, a common refrain in his field is "trust but verify." In other words, we won't know how securely these systems handle our data until independent researchers can verify Apple's claims, as the company promises they will, and until the company responds to their findings.
"Opening yourself up to independent review by researchers is a big step," he says. "But that doesn't determine how you'll respond when researchers tell you something you don't want to hear." Apple did not answer MIT Technology Review's questions about how the company will evaluate researchers' findings.
Apple isn't the only company betting that many of us will grant AI models nearly unfettered access to our personal data if it means they can automate tedious tasks. OpenAI's Sam Altman described his dream AI tool to MIT Technology Review as one "that knows everything about my life, every email, every conversation I've ever had." At its own developer conference in May, Google announced Project Astra, an ambitious project to build a "universal AI agent that is helpful in everyday life."
It’s a gamble that will force many of us to think for the first time about what role, if any, we want AI models to play in how we interact with our data and devices. When ChatGPT first came on the scene, we didn’t have to ask ourselves this question. It was just a text generator that could write us a greeting card or a poem, and the questions it raised—like where the training data came from or what biases it perpetuated—didn’t feel all that personal.
Now, less than two years later, big tech companies are betting billions of dollars that we will trust the security of these systems enough to hand over our personal information. It's not yet clear whether we know enough to make that call, or how able we are to opt out even if we want to. "I do worry that this AI arms race will lead to more and more of our data ending up in other people's hands," Cahn says.
Apple will soon release beta versions of its Apple Intelligence features, starting this fall with the iPhone 15 Pro and the new macOS Sequoia, which can run on Macs and iPads with M1 or later chips. Says Tim Cook, Apple's CEO: "We think Apple Intelligence is going to be indispensable."