THE FBI WANTED A BACKDOOR TO THE IPHONE. TIM COOK SAID NO

In 2016, Tim Cook fought the law—and won.

Late in the afternoon of Tuesday, February 16, 2016, Cook and several lieutenants gathered in the “junior boardroom” on the executive floor at One Infinite Loop, Apple’s old headquarters. The company had just received a writ from a US magistrate ordering it to make specialized software that would allow the FBI to unlock an iPhone used by Syed Farook, a suspect in the San Bernardino shooting in December 2015 that left 14 people dead.

The iPhone was locked with a four-digit passcode that the FBI had been unable to crack. The FBI wanted Apple to create a special version of iOS that would accept an unlimited number of passcode attempts, entered electronically, until the right one was found. The new iOS could be side-loaded onto the iPhone, leaving the data intact.
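The arithmetic behind the request is stark: a four-digit passcode has only 10,000 possibilities, so once retry limits and delays are removed, exhausting them electronically is trivial. A toy sketch of such a brute-force loop (the `check_pin` callback is invented for illustration and stands in for the device's unlock check; this is not Apple's actual interface):

```python
from itertools import product

def brute_force_pin(check_pin):
    """Try every four-digit passcode in order, 0000 through 9999.
    check_pin is a hypothetical stand-in for the device's unlock
    check; with no retry limit, at most 10,000 attempts are needed."""
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if check_pin(guess):
            return guess
    return None  # no four-digit code matched

# Example: a locked "device" whose PIN happens to be 7291.
found = brute_force_pin(lambda pin: pin == "7291")
print(found)  # 7291
```

This is exactly why iOS's escalating delays and optional erase-after-ten-failures setting matter: they are the only thing standing between a four-digit code and a few thousand automated guesses.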

But Apple had refused. Cook and his team were convinced that a new unlocked version of iOS would be very, very dangerous. It could be misused, leaked, or stolen, and once in the wild, it could never be retrieved. It could potentially undermine the security of hundreds of millions of Apple users.

In the boardroom, Cook and his team went through the writ line by line. They needed to decide what Apple’s legal position was going to be and figure out how long they had to respond. It was a stressful, high-stakes meeting. Apple was given no warning about the writ, even though Cook, Apple’s top lawyer, Bruce Sewell, and others had been actively speaking about the case to law enforcement for weeks.

The writ “was not a simple request for assistance in a criminal case,” explained Sewell. “It was a forty-two-page pleading by the government that started out with this litany of the horrible things that had been done in San Bernardino. And then this . . . somewhat biased litany of all the times that Apple had said no to what were portrayed as very reasonable requests. So this was what, in the law, we call a speaking complaint. It was meant to from day one tell a story . . . that would get the public against Apple.”

The team came to the conclusion that the judge’s order was a PR move—a very public arm twisting to pressure Apple into complying with the FBI’s demands—and that it could be serious trouble for the company. Apple “is a famous, incredibly powerful consumer brand and we are going to be standing up against the FBI and saying in effect, ‘No, we’re not going to give you the thing that you’re looking for to try to deal with this terrorist threat,’” said Sewell.

They knew that they had to respond immediately. The writ would dominate the next day’s news, and Apple had to have a response. “Tim knew that this was a massive decision on his part,” Sewell said. It was a big moment, “a bet-the-company kind of decision.” Cook and the team stayed up all night—a straight 16 hours—working on their response. Cook already knew his position—Apple would refuse—but he wanted to know all the angles: What was Apple’s legal position? What was its legal obligation? Was this the right response? How should it sound? How should it read? What was the right tone?


Tracking Phones, Google Is a Dragnet for the Police

When detectives in a Phoenix suburb arrested a warehouse worker in a murder investigation last December, they credited a new technique with breaking open the case after other leads went cold.

The police told the suspect, Jorge Molina, they had data tracking his phone to the site where a man was shot nine months earlier. They had made the discovery after obtaining a search warrant that required Google to provide information on all devices it recorded near the killing, potentially capturing the whereabouts of anyone in the area.

Investigators also had other circumstantial evidence, including security video of someone firing a gun from a white Honda Civic, the same model that Mr. Molina owned, though they could not see the license plate or attacker.

But after he spent nearly a week in jail, the case against Mr. Molina fell apart; investigators learned new information and released him. Last month, the police arrested another man: his mother’s ex-boyfriend, who had sometimes used Mr. Molina’s car.

The warrants, which draw on an enormous Google database employees call Sensorvault, turn the business of tracking cellphone users’ locations into a digital dragnet for law enforcement.
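Conceptually, a geofence warrant of this kind asks a location store a single question: which devices had a recorded fix within a given radius of the scene during a given time window? A minimal sketch, assuming an invented `(device_id, timestamp, lat, lon)` record layout rather than Sensorvault's actual schema:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def geofence_query(records, lat, lon, radius_m, start, end):
    """Return the device IDs with at least one fix inside the fence
    during the window. `records` is a list of
    (device_id, timestamp, lat, lon) tuples -- an invented schema."""
    return {dev for dev, ts, rlat, rlon in records
            if start <= ts <= end
            and haversine_m(lat, lon, rlat, rlon) <= radius_m}
```

The sketch also shows why these warrants sweep so broadly: the query keys on proximity alone, so every passerby with location history enabled lands in the result set alongside any suspect.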

The Arizona case demonstrates the promise and perils of the new investigative technique, whose use has risen sharply in the past six months, according to Google employees familiar with the requests. It can help solve crimes. But it can also snare innocent people.

Technology companies have for years responded to court orders for specific users’ information. The new warrants go further, suggesting possible suspects and witnesses in the absence of other clues. Often, Google employees said, the company responds to a single warrant with location information on dozens or hundreds of devices.

Law enforcement officials described the method as exciting, but cautioned that it was just one tool.


Evaluating the Use of Automated Facial Recognition Technology in Major Policing Operations

Academics at Cardiff University have conducted the first independent academic evaluation of Automated Facial Recognition (AFR) technology across a variety of major policing operations.

The project by the Universities’ Police Science Institute evaluated South Wales Police’s deployment of Automated Facial Recognition across several major sporting and entertainment events in Cardiff over more than a year, including the UEFA Champions League Final and the Autumn Rugby Internationals.

The study found that while AFR can enable police to identify persons of interest and suspects where they would probably not otherwise have been able to do so, considerable investment and changes to police operating procedures are required to generate consistent results.

Researchers employed a number of research methods to develop a rich picture and systematically evaluate the use of AFR by police across multiple operational settings. This is important as previous research on the use of AFR technologies has tended to be conducted in controlled conditions. Using it on the streets and to support ongoing criminal investigations introduces a range of factors impacting the effectiveness of AFR in supporting police work.

The technology works in two modes: Locate is the live, real-time application that scans faces within CCTV feeds in an area. It searches for possible matches against a pre-selected database of facial images of individuals deemed to be persons of interest by the police.

Identify, on the other hand, takes still images of unidentified persons (usually captured via CCTV or mobile phone camera) and compares these against the police custody database in an effort to generate investigative leads. Evidence from the research found that in 68 percent of submissions made by police officers in the Identify mode, the image was not of sufficient quality for the system to work.

Over the period of the evaluation, however, the accuracy of the technology improved significantly and police got better at using it. The Locate system was able to correctly identify a person of interest around 76 percent of the time. A total of 18 arrests were made in live Locate deployments during the evaluation, and more than 100 people were charged following investigative searches during the first eight to nine months of the AFR Identify operation (end of July 2017 to March 2018).

The report suggests that it is more helpful to think of AFR in policing as ‘Assisted Facial Recognition’ rather than a fully ‘Automated Facial Recognition’ system. ‘Automated’ implies that the identification process is conducted solely by an algorithm, when in fact the system serves as a decision-support tool to assist human operators in making identifications. Ultimately, decisions about whether a person of interest and an image match are made by police operators. The system is also deployed in uncontrolled environments, and so is affected by external factors including lighting, weather, and crowd flows.
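The decision-support distinction can be made concrete: a matcher scores a probe image against a watchlist and surfaces any candidates above a confidence threshold, but a human operator makes the final identification. A toy sketch using cosine similarity over face-embedding vectors (the names, vectors, and threshold are all invented; this is not South Wales Police's system):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def candidate_matches(probe, watchlist, threshold=0.8):
    """Score a probe embedding against a watchlist of named
    embeddings and return (name, score) pairs above the threshold,
    best first -- the system suggests, the operator decides."""
    scored = [(name, cosine_similarity(probe, emb))
              for name, emb in watchlist.items()]
    return sorted([(n, s) for n, s in scored if s >= threshold],
                  key=lambda t: t[1], reverse=True)
```

The threshold is the operational lever the report hints at: set it low and operators drown in false candidates; set it high and genuine persons of interest slip through unflagged.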

“There is increasing public and political awareness of the pressures that the police are under to try and prevent and solve crime. Technologies such as Automated Facial Recognition are being proposed as having an important role to play in these efforts. What we have tried to do with this research is provide an evidence-based and balanced account of the benefits, costs and challenges associated with integrating AFR into day-to-day policing,” says Professor Martin Innes, Director of the Crime and Security Research Institute and the Universities’ Police Science Institute.


Federal Partnerships Key to Dismantling Online Drug Markets

When Knoxville first responders found a man dead in his home, there was clear evidence on the scene of the heroin that caused his overdose. Also nearby were clues to how the deadly drugs had reached him. Investigators found a padded manila envelope with postage and markings that provided them another link back to the online drug sellers who have proliferated on the Darknet in recent years.

Drug traffickers are increasingly using anonymous online networks to sell narcotics, including potent synthetic opioids like fentanyl, to buyers who can order and receive the drugs without ever leaving home. What can appear to be a regular e-commerce transaction is one of the delivery channels fueling a deadly nationwide epidemic. The Centers for Disease Control and Prevention reports that drug overdose deaths have been on an upward climb for several years, across the United States and across all demographic groups. In 2017 alone, 70,237 people in this country died of a drug overdose; two-thirds of those deaths involved an opioid.

As part of a government-wide effort to address the epidemic, the Department of Justice created the Joint Criminal Opioid and Darknet Enforcement (J-CODE) team in 2018 to leverage the power of federal and international partnerships to combat the complex and deadly threat of online drug sales.

Now in its second year, J-CODE is delivering results through coordinated efforts and the commitment of the nation’s law enforcement agencies to address opioid sales on the Darknet. Building on the success of last year’s Operation Disarray, the J-CODE team led Operation SaboTor between January and March of this year. These concentrated operations in the United States and abroad led to 61 arrests and shut down 50 Darknet accounts used for illegal activity. Agents executed 65 search warrants, seizing more than 299 kilograms of drugs, 51 firearms, and more than $7 million ($4.504 million in cryptocurrency, $2.485 million in cash, and $40,000 in gold).
