'A Breach of Trust': Facebook Execs Admit Mistakes, Pledge More Security

Mark Zuckerberg acknowledged that there is more the company needs to do

    Facebook Shares Tumble Following Reports of Data Breach

    Facebook shares tumbled Monday following reports that user data had been inappropriately obtained. Cambridge Analytica, whose clients included Donald Trump's presidential campaign, reportedly used the data of 50 million Facebook users without their permission.

    (Published Monday, March 19, 2018)

    Breaking five days of silence, Facebook CEO Mark Zuckerberg apologized for a "major breach of trust," admitted mistakes and outlined steps to protect user data in light of a privacy scandal involving a Trump-connected data-mining firm.

    "I am really sorry that happened," Zuckerberg said of the scandal involving data mining firm Cambridge Analytica. Facebook has a "responsibility" to protect its users' data, he said in a Wednesday interview on CNN. If it fails, he said, "we don't deserve to have the opportunity serve people."

    His mea culpa on cable television came a few hours after he acknowledged his company's mistakes in a Facebook post, but without saying he was sorry.

    The company's second-in-command, Sheryl Sandberg, shared Zuckerberg's post and echoed his sentiment.

    "This was a major violation of peoples' trust, and I deeply regret we didn't do enough to deal with it," she said.

    Zuckerberg and Sandberg had been quiet since news broke Friday that Cambridge Analytica may have used data improperly obtained from roughly 50 million Facebook users to try to sway elections.

    Facebook shares have dropped some 8 percent, lopping about $46 billion off the company's market value, since the revelations were first published.

    Even before the scandal broke, Facebook had already taken the most important steps to prevent a recurrence, Zuckerberg said. For example, in 2014 it reduced the access outside apps had to user data. However, some of the measures didn't take effect until a year later, allowing Cambridge to access the data in the intervening months.

    Zuckerberg acknowledged that there is more to do.

    In a Facebook post on Wednesday, Zuckerberg said the company will ban developers who don't agree to an audit. An app's developer will no longer have access to data from people who haven't used that app in three months. Data will also be generally limited to user names, profile photos and email addresses, unless the developer signs a contract with Facebook and gets user approval.

    In a separate post, Facebook said it will inform people whose data was misused by apps. Facebook first learned of this breach of privacy more than two years ago, but hadn't mentioned it publicly until Friday.

    The company said it was "building a way" for people to know if their data was accessed by "This Is Your Digital Life," the psychological-profiling quiz app that researcher Aleksandr Kogan created and paid about 270,000 people to take part in. Cambridge Analytica later obtained information from the app for about 50 million Facebook users, as the app also vacuumed up data on people's friends — including those who never downloaded the app or gave explicit consent.

    Chris Wylie, a Cambridge co-founder who left in 2014, has said one of the firm's goals was to influence people's perceptions by injecting content, some misleading or false, all around them. It's not clear whether Facebook would be able to tell users whether they had seen such content.

    Cambridge has shifted the blame to Kogan, whom the firm described as a contractor. Kogan described himself as a scapegoat.

    Kogan, a psychology researcher at Cambridge University, told the BBC that both Facebook and Cambridge Analytica have tried to place the blame on him, even though the firm assured him that everything he did was legal.

    "One of the great mistakes I did here was I just didn't ask enough questions," he said. "I had never done a commercial project. I didn't really have any reason to doubt their sincerity. That's certainly something I strongly regret now."

    He said the firm paid some $800,000 for the work, but that money went to participants in the survey.

    "My motivation was to get a dataset I could do research on," he said. "I have never profited from this in any way personally."

    Authorities in Britain and the United States are investigating.

    David Carroll, a professor at Parsons School of Design in New York who sued Cambridge Analytica in the U.K., said he was not satisfied with Zuckerberg's response, but acknowledged that "this is just the beginning."

    He said it was "insane" that Facebook had yet to take legal action against Cambridge parent SCL Group over the inappropriate data use. Carroll himself sued Cambridge Friday to recover data on him that the firm had obtained.

    Sandy Parakilas, who worked in data protection for Facebook in 2011 and 2012, told a U.K. parliamentary committee Wednesday that the company was vigilant about its network security but lax when it came to protecting users' data.

    He said personal data, including email addresses and in some cases private messages, was allowed to leave Facebook servers with no real controls on how the data was used after that.

    "The real challenge here is that Facebook was allowing developers to access the data of people who hadn't explicitly authorized that," he said, adding that the company had "lost sight" of what developers did with the data.

    Read the statements Zuckerberg and Sandberg posted to Facebook below.

    MARK ZUCKERBERG:
    I want to share an update on the Cambridge Analytica situation -- including the steps we've already taken and our next steps to address this important issue.

    We have a responsibility to protect your data, and if we can't then we don't deserve to serve you. I've been working to understand exactly what happened and how to make sure this doesn't happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there's more to do, and we need to step up and do it.

    Here's a timeline of the events:

    In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends' birthdays, your maps should show where your friends live, and your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.

    In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who shared their data as well as some of their friends' data. Given the way our platform worked at the time this meant Kogan was able to access tens of millions of their friends' data.

    In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access. Most importantly, apps like Kogan's could no longer ask for data about a person's friends unless their friends had also authorized the app. We also required developers to get approval from us before they could request any sensitive data from people. These actions would prevent any app like Kogan's from being able to access so much data today.

    In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica. It is against our policies for developers to share data without people's consent, so we immediately banned Kogan's app from our platform, and demanded that Kogan and Cambridge Analytica formally certify that they had deleted all improperly acquired data. They provided these certifications.

    Last week, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services. Cambridge Analytica claims they have already deleted the data and has agreed to a forensic audit by a firm we hired to confirm this. We're also working with regulators as they investigate what happened.

    This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.

    In this case, we already took the most important steps a few years ago in 2014 to prevent bad actors from accessing people's information in this way. But there's more we need to do and I'll outline those steps here:

    First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data Kogan misused here as well.

    Second, we will restrict developers' data access even further to prevent other kinds of abuse. For example, we will remove developers' access to your data if you haven't used their app in 3 months. We will reduce the data you give an app when you sign in -- to only your name, profile photo, and email address. We'll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we'll have more changes to share in the next few days.

    Third, we want to make sure you understand which apps you've allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you've used and an easy way to revoke those apps' permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.

    Beyond the steps we had already taken in 2014, I believe these are the next steps we must take to continue to secure our platform.

    I started Facebook, and at the end of the day I'm responsible for what happens on our platform. I'm serious about doing what it takes to protect our community. While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn't change what happened in the past. We will learn from this experience to secure our platform further and make our community safer for everyone going forward.

    I want to thank all of you who continue to believe in our mission and work to build this community together. I know it takes longer to fix all these issues than we'd like, but I promise you we'll work through this and build a better service over the long term.

    SHERYL SANDBERG:
    Sharing Mark's post addressing the Cambridge Analytica news. As he said, we know that this was a major violation of people's trust, and I deeply regret that we didn't do enough to deal with it. We have a responsibility to protect your data - and if we can't, then we don't deserve to serve you.

    We've spent the past few days working to get a fuller picture so we can stop this from happening again. Here are the steps we're taking. We're investigating all apps that had access to large amounts of information before we changed our platform in 2014 to dramatically reduce data access. And if we find that developers misused personally identifiable information, we'll ban them from our platform and we'll tell the people who were affected.

    We're also taking steps to reduce the data you give an app when you use Facebook login to your name, profile photo, and email address. And we'll make it easier for you to understand which apps you've allowed to access your data.

    You deserve to have your information protected - and we'll keep working to make sure you feel safe on Facebook. Your trust is at the core of our service. We know that and we will work to earn it.