Tim Cook implies that Facebook’s business model of maximizing engagement leads to polarization and violence

Apple CEO Tim Cook spoke today at the virtual Computers, Privacy and Data Protection conference, condemning the business model of companies such as Facebook and highlighting Apple’s commitment to advancing user privacy.

“In a time of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement (the longer the better) and all with the goal of collecting as much data as possible,” Cook said. “It is long past time to stop pretending that this approach doesn’t come with a cost: polarization, lost trust, and, yes, violence,” he added.

Cook highlighted two recent privacy measures Apple has taken: privacy labels in the App Store and App Tracking Transparency, which will require apps to ask users for permission before tracking them, starting with upcoming versions of iOS 14, iPadOS 14, and tvOS 14. Apple says the software updates will be released in early spring.
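For developers, that permission request is surfaced through Apple’s AppTrackingTransparency framework. The snippet below is a minimal sketch of how an app might prompt for authorization; the function name and print statements are illustrative and not taken from Apple’s documentation.

import AppTrackingTransparency
import AdSupport

// Minimal sketch: request tracking permission once the app is active.
// Requires an NSUserTrackingUsageDescription entry in the app's Info.plist.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Tracking allowed; the advertising identifier (IDFA) is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted:
            // The user or a device policy declined; the IDFA is all zeroes.
            print("Tracking not allowed")
        case .notDetermined:
            // The prompt has not been shown yet.
            print("Tracking authorization not determined")
        @unknown default:
            break
        }
    }
}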

Today is Data Privacy Day, and Apple has marked the occasion by sharing “A Day in the Life of Your Data,” an easy-to-understand PDF report that explains how third-party companies track users’ data across websites and apps. The report also outlines Apple’s privacy principles and provides more details on App Tracking Transparency.

Cook’s remarks can be heard in this YouTube video, starting at the 3:50 mark:

Below is a full transcript of Cook’s prepared remarks.

Good afternoon.

John, thank you for the generous introduction and for welcoming us today.

It is a privilege to join you, and to learn from this knowledgeable panel, on this fitting occasion of Data Privacy Day.

Just over two years ago, together with my good friend, the much-missed Giovanni Buttarelli, and data protection regulators from around the world, I spoke in Brussels about the emergence of a data-industrial complex.

At that meeting we asked ourselves, “What kind of world do we want to live in?”

Two years later, we should now examine very carefully how we answered this question.

The fact is that an interconnected ecosystem of companies and data brokers, of purveyors of fake news and peddlers of division, of trackers and hucksters just looking to make a quick buck, is more present in our lives than it has ever been.

And it has never been so clear how it degrades our fundamental right to privacy first, and our social fabric by consequence.

As I said before, “if we accept as normal and inevitable that everything in our lives can be aggregated and sold, we lose much more than data. We lose the freedom to be human.”

And yet, this is a hopeful new season. A time of reflection and reform. And the most concrete progress of all is thanks to many of you.

Proving the cynics and the doomsayers wrong, the GDPR has provided an important foundation for privacy rights around the world, and its implementation and enforcement must continue.

But we can’t stop there. We need to do more. And we are already seeing hopeful steps around the world, including a successful ballot initiative strengthening consumer protections right here in California.

Together, we must send a universal, humanistic response to those who claim a right to users’ private information about what will and will not be tolerated.

As I said in Brussels two years ago, it is certainly time, not only for a comprehensive privacy law here in the United States, but also for global laws and new international agreements that enshrine the principles of data minimization, user knowledge, user access, and data security worldwide.

At Apple, spurred on by the leadership of many of you in the privacy community, it’s been two years of relentless action.

We have worked not only to deepen our own core privacy principles, but to create ripples of positive change across the industry.

We’ve talked time and time again about strong encryption with no back doors, recognizing that security is the foundation of privacy.

We’ve set new industry standards for data minimization, user control, and on-device processing, from location data to your contacts and photos.

As we’ve led the way in features that keep you healthy and well, we’ve made sure that technologies like the blood oxygen sensor and the ECG come with the peace of mind that your health data is yours.

And last but not least, we are deploying new and powerful requirements to advance user privacy across the App Store ecosystem.

The first is a simple but revolutionary idea that we call the privacy nutrition label.

Each app, including ours, must share its data collection and privacy practices, information that the App Store presents in a way that all users can understand and act on.

The second is called App Tracking Transparency. At its foundation, ATT is about returning control to users, about giving them a say in how their data is handled.

Users have been asking for this feature for a long time. We have worked closely with developers to give them the time and resources to implement it. And we are passionate about it because we think it has great potential to make things better for everyone.

Because ATT answers a very real problem.

Earlier today, we released a new paper called “A Day in the Life of Your Data.” It explains how the apps we use every day contain an average of six trackers. This code often exists to surveil and identify users across apps, watching and recording their behavior.

In this case, what the user sees is not always what they get.

Right now, users may not know whether the apps they use to pass the time, to check in with their friends, or to find a place to eat may in fact be passing on information about the photos they have taken, the people in their contact list, or location data that reflects where they eat, sleep, or pray.

As the document shows, it seems like no information is too private or personal to be monitored, monetized, and aggregated into a 360-degree view of your life. The end result of all this is that you are no longer the customer, you are the product.

When ATT is in full effect, users will have a say over this kind of tracking.

Some may think that sharing this degree of information is worth it for more targeted ads. Many others, I suspect, will not, just as most appreciated it when we built similar functionality limiting web trackers into Safari several years ago.

We see developing these kinds of privacy-focused features and innovations as a core responsibility of our work. We always have, and we always will.

The fact is that the ATT debate is a microcosm of a debate we have been having for a long time, one where our point of view is very clear.

Technology does not need vast troves of personal data, stitched together across dozens of websites and apps, in order to succeed. Advertising existed and thrived for decades without it. And we are here because the path of least resistance is seldom the path of wisdom.

If a business is built on misleading users, on data exploitation, on choices that are no choices at all, it does not deserve our praise. It deserves reform.

We should not look away from the big picture.

In a time of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement (the longer the better) and all with the goal of collecting as much data as possible.

Too many are still asking the question, “How much can we get away with?” when they need to be asking, “What are the consequences?”

What are the consequences of prioritizing conspiracy theories and violent incitement simply because of their high engagement rates?

What are the consequences of not just tolerating but rewarding content that undermines public trust in life-saving vaccines?

What are the consequences of seeing thousands of users join extremist groups, and then perpetuating an algorithm that recommends even more?

It is long past time to stop pretending that this approach doesn’t come with a cost: polarization, lost trust, and, yes, violence.

A social dilemma cannot be allowed to become a social catastrophe.

I think the past year, and certainly recent events, have brought home this risk to all of us, as a society and as individuals, as much as anything else has.

The long hours spent at home, the challenge of keeping children learning while schools are closed, the worry and uncertainty about what the future would hold: all of it threw into sharp relief how technology can help, and how it can be used to harm.

Will the future belong to the innovations that make our lives better, fuller, and more human?

Or will it belong to tools that prize our attention to the exclusion of everything else, compounding our fears and aggregating extremism, to serve ever more invasively targeted ads over all other ambitions?

At Apple, we chose a long time ago.

We believe that ethical technology is technology that works for you. It’s technology that helps you sleep, not keeps you up. That tells you when you’ve had enough, that gives you space to create, or draw, or write, or learn, not refresh just one more time. It’s technology that can fade into the background when you’re on a hike or going for a swim, but is there to warn you when your heart rate spikes or to help you when you’ve had a nasty fall. And all of this, always, puts privacy and security first, because no one needs to trade away the rights of their users to deliver a great product.

Call us naive. But we still believe that technology made by people, for people, and with people’s well-being in mind is too valuable a tool to abandon. We still believe that the best measure of technology is the lives it improves.

We are not perfect. We will make mistakes. That is what makes us human. But our commitment to you, now and always, is that we will keep faith with the values that have inspired our products from the beginning. Because what we share with the world is nothing without the trust our users have in it.

To all of you who have joined us today, please keep pushing us all forward. Keep setting high standards that put privacy first, and keep taking new and necessary steps to reform what is broken.

We have moved forward together and we need to do more. Because it is always the right time to be bold and brave in the service of a world where, as Giovanni Buttarelli said, technology serves people and not the other way around.

Thank you so much.

