Tag: Privacy

Phone texts don’t die: they hide

The computer forensics expert who recovered the text messages that brought down parliamentary Speaker Peter Slipper has warned that any messages or files you think you have deleted from your smartphone are still there if someone really wants to find them.

The national head of the IT forensics practice at corporate advisory firm PPB Advisory, Rod McKemmish, was brought in by the legal team of Mr Slipper’s former staffer James Ashby, as some of the messages he had received from the former speaker had been deleted.

He was able to use an automated forensic process to bring the messages back from the dead.

“The delete button on the phone should really be called the ‘hide’ button, because the data is still there, you just can’t see it,” Mr McKemmish said. “In the forensic process we can bring it all back.”

While most politicians and business people are unlikely to be communicating about the sort of topics that brought down Mr Slipper, many might rethink the privacy of their communications.

With soaring levels of smartphone penetration in Australia, it is fair to assume that a significant number of sensitive discussions take place via SMS.

Mr McKemmish said his skills were increasingly being called upon to investigate corporate cases, where firms were concerned about confidential information residing on the phones of departing staff. Most phones have a “factory reset” feature, which is supposed to return the phone to the state it was in when first used, but even a reset does not reliably erase the data.

IBRS technology analyst James Turner said businesses needed to be more alert to the permanent nature of digital communication, as more important conversations were handled by SMS and email.

“This can be share price-impacting information, because deals can be made via an SMS that are worth a lot of money,” he said. “The audit trail is all important when it comes to being able to report that due process has been followed, so if people are using electronic communications, then they must expect that there is a record.”

Not all communication via SMS or email is related to big deals of course. Much could be slotted into the files marked “harmless banter” or “office gossiping”. Common stuff, but not necessarily words that people want to be accessible once the messages have been deleted.

Unfortunately for regular texters, a computer forensics expert and adjunct professor at Queensland University of Technology, Bradley Schatz, says smartphones were designed to hold on to data as a guard against accidental loss.

He says there are a number of factors that will govern how long a message exists on a phone after it has supposedly been deleted, but a basic guide is that it will remain somewhere on the phone until all available space for new data has been exhausted.

“The memory inside many of these small-scale digital devices is called flash memory, which is the same kind of memory that you would find in a USB key,” Schatz said.
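To make Schatz’s point concrete, here is a minimal, purely illustrative Python sketch of flash-style storage in which “deleting” a message only removes its index entry and marks the page as reusable; the bytes themselves stay visible to a forensic scan until a later write happens to reuse the page. The class and all of its names are invented for the example and are not taken from any real phone firmware or forensic tool.

```python
# Illustrative sketch only: models why "deleted" messages linger in flash
# storage until their pages are reused. Not real phone firmware.

class FlashStore:
    def __init__(self, num_pages):
        self.pages = [None] * num_pages      # raw contents of each page
        self.free = set(range(num_pages))    # pages the filesystem may reuse
        self.index = {}                      # message id -> page number

    def write(self, msg_id, text):
        page = self.free.pop()               # grab any free page
        self.pages[page] = text              # the data physically lives here
        self.index[msg_id] = page

    def delete(self, msg_id):
        # "Delete" only drops the index entry and marks the page reusable.
        # The text itself is not erased.
        page = self.index.pop(msg_id)
        self.free.add(page)

    def forensic_scan(self):
        # A forensic tool ignores the index and reads every page directly.
        return [p for p in self.pages if p is not None]


store = FlashStore(num_pages=4)
store.write("sms1", "meet at 9pm")
store.delete("sms1")
print(store.index)            # {} -> the phone's UI shows nothing
print(store.forensic_scan())  # ['meet at 9pm'] -> still recoverable
```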

Read More

If you’re a Comcast cable customer, your home’s private Wi-Fi router is being turned into a public hotspot.

It’s been one year since Comcast (CMCSA) started its monster project to blanket residential and commercial areas with continuous Wi-Fi coverage. Imagine waves of wireless Internet emitting from every home, business and public waiting area.

Comcast has been swapping out customers’ old routers with new ones capable of doubling as public hotspots. So far, the company has turned 3 million home devices into public ones. By year’s end it plans to activate that feature on the other 5 million already installed.

Anyone with an Xfinity account can register their devices (laptop, tablet, phone) and the public network will always keep them registered — at a friend’s home, coffee shop or bus stop. No more asking for your cousin’s Wi-Fi network password.

But what about privacy? It’s potentially creepy and annoying, but the upside is Internet everywhere. And it seems like Comcast did this the right way.

Outsiders never get access to your private, password-protected home network. Each box has two separate antennae, Comcast explained. That means criminals can’t jump from the public channel into your network and spy on you.

And don’t expect every passing stranger to get access. The Wi-Fi signal is no stronger than it is now, so anyone camped in your front yard will have a difficult time tapping into the public network. This system was meant for guests at home, not on the street.

As for strangers tapping your router for illegal activity: Comcast said you’ll be guilt-free if the FBI comes knocking. Anyone hooking up to the “Xfinity Wi-Fi” public network must sign in with their own traceable, Comcast customer credentials.

Still, no system is foolproof, and this could be unnecessary exposure to potential harm. Craig Young, a computer security researcher at Tripwire, has tested the top 50 routers on the market right now. He found that two-thirds of them have serious weaknesses. If a hacker finds one in this Comcast box, all bets are off.

“If you’re opening up another access point, it increases the likelihood that someone can tamper with your router,” he said.

Read More

The over-the-shoulder password snooper is nothing new, but this time he’s wearing Google Glass — and he’s after your iPad PIN.

Cyber forensics experts at the University of Massachusetts in Lowell have developed a way to steal passwords entered on a smartphone or tablet using video from Google’s face-mounted gadget and other video-capturing devices. The thief can be nearly ten feet away and doesn’t even need to be able to read the screen — meaning glare is not an antidote.

The security researchers created software that maps the shadows from fingertips typing on a tablet or smartphone. Their algorithm then converts those touch points into the actual keys they were touching, enabling the researchers to crack the passcode.
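The researchers’ full system is not described in detail here, but the final step, turning estimated touch points into key presses, can be sketched. The short Python example below is a simplification of that idea, with an invented keypad layout and invented touch coordinates: once the video analysis yields fingertip positions in normalised screen coordinates, each position is snapped to the nearest key centre on a standard PIN pad.

```python
# Simplified illustration of the last step of such an attack: mapping
# estimated fingertip touch points (already normalised to 0..1 screen
# coordinates) onto the nearest key of a standard 3x4 PIN pad.
# The layout and the sample touch points below are invented for the example.

import math

# Key centres on a normalised screen, rows from top to bottom.
KEYPAD = {
    "1": (0.25, 0.2), "2": (0.50, 0.2), "3": (0.75, 0.2),
    "4": (0.25, 0.4), "5": (0.50, 0.4), "6": (0.75, 0.4),
    "7": (0.25, 0.6), "8": (0.50, 0.6), "9": (0.75, 0.6),
                      "0": (0.50, 0.8),
}

def nearest_key(x, y):
    # Pick the key whose centre is closest to the estimated touch point.
    return min(KEYPAD, key=lambda k: math.dist((x, y), KEYPAD[k]))

def recover_pin(touch_points):
    return "".join(nearest_key(x, y) for x, y in touch_points)

# Hypothetical touch points recovered from video of someone typing "1379".
observed = [(0.27, 0.18), (0.73, 0.22), (0.26, 0.61), (0.74, 0.58)]
print(recover_pin(observed))  # -> "1379"
```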

They tested the algorithm on passwords entered on an Apple (AAPL, Tech30) iPad, Google’s (GOOGL, Tech30) Nexus 7 tablet, and an iPhone 5.

Why should you be worried?

“We could get your bank account password,” researcher Xinwen Fu said.

The software can be applied to video taken on a variety of devices: Fu and his team experimented with Google Glass, cell phone video, a webcam and a camcorder. The software worked on camcorder video taken at a distance of over 140 feet.

Of course, pointing a camcorder in a stranger’s face might yield some suspicion. The rise of wearable technology is what makes this approach actually viable. For example, a smartwatch could stealthily record a target typing on his phone at a coffee shop without drawing much attention.

Fu says Google Glass is a game-changer for this kind of vulnerability.

“The major thing here is the angle. To make this attack successful the attacker must be able to adjust the angle to take a better video … they see your finger, the password is stolen,” Fu said.

Google says that it designed Glass with privacy in mind, and it gives clear signals when it is being used to capture video.

“Unfortunately, stealing passwords by watching people as they type them into ATMs and laptops is nothing new,” said Google spokesman Chris Dale. “The fact that Glass is worn above the eyes and the screen lights up whenever it’s activated clearly signals it’s in use and makes it a fairly lousy surveillance device.”

Read More

Using a simple camera or camera phone, it is now possible to snap a photo of almost any key ring and use the image to make a physical copy of a key.

The I-Team did just that, using a web site called keysduplicated.com.

With a smartphone camera, the I-Team took a photo of a key ring lying unattended on a desk. After the image was run through photo-editing software and one house key was isolated, the image was uploaded to the web site. Within a week, the company sent the I-Team a copy of the key, which opened the front door of a home.

In crowded Bryant Park one recent day, Devon White’s key ring was one of several the I-Team found sitting in plain sight, vulnerable to a camera phone snapshot.

“It’s always cool when new technology moves in a new direction, but it is a bit worrying,” said White, of Queens, after learning about this new vulnerability. “You wonder … anybody could just take a picture of anybody else’s key.”

Police in Nassau County said they first became aware of the new key-cutting technology in recent months. They said they have not linked any crimes to it yet, but they are urging people to use caution in how they handle their keys.

“All it takes to cut a key is you just have to have the outline of the key,” said Detective Sgt. Richard Harasym, who heads Nassau’s Crimes Against Property Unit. “It’s unlocking the keys to your castle, so to speak, and if you leave them out there, then you run the risk that something bad could happen.”

The I-Team contacted Ali Rahimi, the founder of the web site. Rahimi admitted News 4’s demonstration reveals a security risk, and said he will look for ways to close it. Still, he said he’s unaware of any customer who’s ever used one of his keys illegally.

“It’s worth incorporating the lessons we’ve learned from your experiment. It’ll take some thought,” said Rahimi.

Rahimi said one possibility might be to use software that detects when photos have been doctored. The web site does require a credit card for payment. Until better security checks can be implemented, he said his employees will screen for any suspicious-looking pictures, and ask for additional photos of the key in the customer’s hands.

“That’ll ensure they have physical access to the key,” he said.

View Source

Google is already receiving demands from people to remove links from its search results just days after Europe’s highest court said people worried about their privacy have the “right to be forgotten” on the Internet.

The European Court of Justice on Tuesday found Google and other search engines control information and are responsible for removing unwanted links if requested. In the ruling, the court decided that Google results linking to a newspaper’s notice about a Spanish man’s social security debts in 1998 were no longer relevant and must be deleted.

Google can, however, decline requests the company believes are in the public interest to remain in its search results.

Google declined to say how many people have requested information to be taken down as a result of the ruling. But some of the people who have requested that Google remove unsavory Web pages about them demonstrate the murky situation Google finds itself in: A politician, a poorly reviewed doctor and a pedophile are among the first to have issued take-down requests.

A person with knowledge of the requests said a man convicted of possession of child pornography has requested that Google (GOOG, Fortune 500) remove links to Web pages about his conviction. A former politician has also requested that the search engine remove links to a news article about his behavior while he was holding office. And a physician has requested that links to a review site be removed.

Google has not yet taken the links down. The company said it first needs to develop a procedure to handle a potential flood of requests for removal.

“The ruling has significant implications for how we handle take-down requests,” a Google spokesman said. “This is logistically complicated — not least because of the many languages involved and the need for careful review. As soon as we have thought through exactly how this will work, which may take several weeks, we will let our users know.”

Google is used to handling take-down requests. The search engine said it received more than 25 million requests from companies claiming Google results linked to material that infringes on copyrights. Google also receives thousands of requests from governments to take down links to websites that violate laws. Google complies with fewer than half of the government take-down requests but does not specify its compliance rate for copyright-related requests.

But copyright and many other laws are considerably clearer-cut than the test of “relevance to public interest,” which Google will now need to abide by in the European Union.

Read More

Once Searched, Forever Seized?

How private is the data on your cell phone? That was the big question before the Supreme Court last week in a pair of cases, Riley v. California and United States v. Wurie, with the potential for huge consequences for the future of information privacy.

The cases involve a longstanding exception to the Fourth Amendment that permits the police to search items on or near someone they have arrested, no warrant required. The rule was intended to keep officers safe and prevent the destruction of evidence. In recent years, however, the rule has given police free rein to seize and search the devices that store our calls, text messages, e-mails, and troves of other personal data such as our financial history, medical information, and daily movements.

Many of the Justices expressed concern over the disproportionate invasion of privacy, suggesting that a warrant should be required for a cell phone search. But there was another question that caught the Court’s attention: What happens to all that data once the police have it?

It is common practice to copy the contents of a device before searching it. But if the police don’t need a warrant to do that, then there is also no judicial check on what happens to that information, how it’s used, or who gets to see it. A warrant requirement would serve two purposes. It would be a bulwark against highly invasive fishing expeditions resembling the “general warrants” so abhorred by the nation’s Founding Fathers. And it would provide a way to limit the sensitive information that the government is allowed to keep and share about you.

The retention of cell phone data raises extraordinary privacy concerns above and beyond whatever visual inspection a police officer might need to conduct on the spot. In the Riley case, for example, a San Diego detective admitted to downloading “a lot of stuff” from the cell phone at a regional computer forensics lab run by the FBI. The lab gave local police access to sophisticated forensics technology capable of making mirror copies of data stored on electronic devices.

An amicus brief filed by the Brennan Center for Justice points out that this kind of sophisticated data extraction is not unusual. It is a law enforcement tactic that has become increasingly popular around the country. Since 1999, the FBI has partnered with local law enforcement agencies to establish a network of forensic computer labs in 19 states. When it comes to cell phone data, these laboratories provide local police access to “Cell Phone Investigative Kiosks,” which allow officers to “extract data from a cell phone, put it into a report, and burn the report to a CD or DVD in as little as 30 minutes.” In other words, the police can pull data off your cell phone about your ‘whole life’ in the time it takes you to upload pictures of your vacation to Facebook.

Read More

LOS ANGELES — Officers at thousands of law enforcement agencies are wearing tiny cameras to record their interactions with the public, but in many cases the devices are being rolled out faster than departments are able to create policies to govern their use.

And some rank-and-file officers are worried the technology might ultimately be used to derail their careers if, for example, an errant comment about a superior is captured on tape.

Most law enforcement leaders and civil liberties advocates believe the cameras will ultimately help officers because the devices give them a way to record events from their point of view at a time when citizens armed with cellphones are actively scrutinizing their every move.

They say, however, that the lack of clear guidelines on the cameras’ use could potentially undermine departments’ goals of creating greater accountability of officers and jeopardize the privacy of both the public and law enforcement officers.

“This is a brave new world that we’re entering here, where citizens and police both are going to be filming each other,” said Chuck Wexler, the executive director of the Police Executive Research Forum, a nonprofit police research and policy organization.

The U.S. Justice Department has asked Wexler’s group to help develop guidelines for the cameras’ use, from when the devices should be turned on to how departments can protect the privacy of those who are inadvertently captured on the footage.

Equipping police with cameras isn’t a new concept. For decades police have used cameras mounted to the dashboards of their patrol cars — initially referred to with suspicion by officers as “indict-o-cams” until they discovered the footage exonerated them in most cases.

As camera technology and data storage have become more affordable and reliable, the use of portable cameras has increased over the last five years. Now officers in one of every six departments are patrolling with them on their chests, lapels or sunglasses, according to Scott Greenwood, general counsel for the national American Civil Liberties Union and an expert on the cameras.

With the push of a finger, officers can show the dangers and difficulties of their work. Unlike dashboard cameras, body cameras follow the officer everywhere — when their cruiser stays parked at the curb, when they go into homes on search warrants or when they are running after a suspect.

The cameras, if they aren’t turned off, can go with officers into a bathroom or locker room, or capture private conversations between partners. Footage can become evidence in a criminal case, or be used to discipline officers or exonerate them of false accusations.

Without strong policies, experts say, departments could lose the public’s trust. The public needs to know cameras aren’t only being turned on when it’ll help officers. But there are certain moments such as during the interview of a sexual assault victim or talk with a confidential informant when filming may be sensitive or even compromise a case, said Bay Area attorney Mike Rains, whose firm often represents officers and has worked on body camera policies with departments.

The Los Angeles Police Department is now field testing cameras with an eye toward ultimately deploying them to all patrol officers — a move that would make its program the nation’s largest. For the six months of the test, underway now, there will be no official policy. Department officials say a policy will be created with input from the community and union, when they know more about how the cameras work in the field.

Union chief Tyler Izen, who represents more than 9,900 sworn officers, said that while there’ve been no complaints so far, the strategy is risky and could be problematic for his officers as well as the public, which has become an involuntary guinea pig in the trial. “They’re basically taking their chances,” Izen said.

There’s still very little research into the impacts of these cameras on policing and their ripple effects on the criminal justice system, said Justin Ready, assistant professor at Arizona State’s department of criminology and criminal justice. But more studies are underway, including two that Ready is involved in.

The police department in Rialto, Calif., concluded a yearlong University of Cambridge study last year that found an 89 percent drop in complaints against officers during the camera trial. The chief has since mandated the cameras’ deployment to the department’s roughly 90 sworn officers.

Rialto police Sgt. Richard Royce said he was exonerated by the footage during the study.

“I’d rather have my version of that incident captured on high-definition video in its entirety from my point of view, than to look at somebody’s grainy cellphone camera footage captured 100 feet away that gets cropped, edited, changed or manipulated,” Royce said.

Greenwood of the ACLU said he’s provided input in drawing up the Justice Department guidelines. He said the proposed policy is pretty good, but gives officers more discretion than is wise.

“It’s a far better policy decision to mandate the encounter be recorded and deal with the unwanted video,” Greenwood said. Because if a situation goes bad quickly and there’s no footage, the officer is in trouble, Greenwood said.

Captured video could protect the department — and ultimately the taxpayer — from a false claim and expensive litigation or result in disciplining a problem officer.

One case in Oakland is being used to educate officers in California about the technology. An officer chasing a suspect said he saw the suspect with a gun in his hand before fatally shooting him three times in ¾ of a second. A gun was later found in the grass.

It cost the city $10,000 to have roughly 15 seconds of video analyzed by an expert, and because of the angle of where the camera was placed — on the officer’s chest — no gun was seen in the suspect’s hand on film, said Rains, an attorney whose firm represented the officer.

Sgt. Barry Donelan, the police union chief in Oakland, said the department initially moved to terminate the officer for an excessive response, but he was ultimately exonerated because the video analysis backed up the officer’s account.

Donelan said the danger with such footage is that it taps into a human tendency to over-rely on video at the expense of other accounts of an event, and can be especially problematic in high-adrenaline situations.

When that happens, “it’s just about the camera,” Donelan said. “It’s the ultimate Monday morning quarterbacking tool.”

View Source

Police body cameras spur privacy debate

The woman says she doesn’t know why she’s being pulled over, but it’s obvious: she’s driving on the wrong side of the road. And when a police officer asks the woman to get out of the car, she rams ahead before crashing into a pole and taking off on foot. She’s stopped with a stun gun and handcuffed.

You can watch the whole thing on YouTube, thanks to the Laurel Police Department’s decision to outfit its officers with what the police call “lipstick cameras.”

Enclosed in a slim piece of black plastic, the recorder attaches to a pair of sunglasses or a headband. The city started using the device six months ago. Since then, Chief Rich McLaughlin says, complaints against officers have gone down and so has the use of police force.

“It keeps everybody in check, on both sides,” he said.

Although dashboard cameras in police cruisers are ubiquitous, they provide a limited vantage point and capture mostly traffic stops. On-body cameras that take in everything an officer sees have started to gain traction nationwide; one recently captured the police shooting of a former New York Giants player in Daytona Beach, Fla. Laurel is one of the first departments in the Washington region to adopt them, along with Cheverly and New Carrollton. Hyattsville and the University of Maryland plan to start using them soon.

Police say the videos can provide valuable evidence in court and a clear record of the actions of officers. But questions remain about use of the cameras — precisely when they should be turned on and off — and what becomes of the countless hours of video footage. Some officials also worry that the cameras will discourage some people from approaching officers with tips or concerns.

“This is a discussion that’s bigger than just whether cameras work or not,” said Baltimore City Fraternal Order of Police President Bob Cherry. Next, he suggested, could come cameras on public school teachers and medical professionals. “How far do we want to go as a society, in terms of recording everything?”

The American Civil Liberties Union, which generally is wary of surveillance, recently expressed support for the cameras. But the organization acknowledges the privacy concerns of the police and the public, and its support comes with conditions.

“I absolutely know this tool will transform policing,” Scott Greenwood, a police accountability attorney and general counsel for the ACLU, said in an interview. “It’s an unalloyed good, provided that policies are in place that mandate the use of devices rather than leaving it up to the discretion of the officers.”

The ACLU calls for consent from a filmed citizen when releasing videos — a policy that could have barred Laurel’s car chase from YouTube — and redaction where feasible. Body camera makers are only just starting to catch up with that demand.

The video is “really helpful, but it also raises concerns,” said police Sgt. Rob Drager of Albuquerque, one of the first departments to use the body cameras. Under state information request laws, he said, his department once released a tape to a local news station that included unedited video of officers responding as a child was being strangled.

“We encounter people on the worst day of their lives, and now that’s public record, that’s out there,” he said.

There are also questions about what ultimately happens to the video — even scenes of mundane interaction between police and citizens.

“Who owns the data?” asked Steven Edwards, a senior policy adviser at the Justice Department, during a recent conference on body cameras organized by the Police Executive Research Forum, a national membership group. “Five years from now, how will this data be used?”

In New York, a judge has ordered some police to wear cameras as part of the ruling on the city’s “stop and frisk” policy. Los Angeles has decided to use the cameras — if the city can pay for them. D.C. police spokeswoman Gwendolyn Crump said city officers don’t wear cameras but “technology is constantly evolving and we will keep the possibility open.”

More than 100 small police departments in Virginia, including the cities of Fairfax and Falls Church, have been given body cameras by VML Insurance Programs.

But Lt. John Bisek of the Manassas Police Department said those cameras were more of a conversation-starter than a solution. “The quality isn’t there,” he said. More expensive cameras have better security measures to prevent tampering and require less upkeep, he said, but Manassas doesn’t have the funding.

Most of the cameras cost hundreds of dollars; data storage is an added cost. Laurel uses Taser, which sells a high-end camera and data system.

When they were first told they had to film every encounter, some officers in Laurel were not thrilled, McLaughlin said. But now they come to him asking for the cameras. He just ordered a new batch, and now nearly all 70 officers have them.

Officers from nearby cities “ask, ‘Oh, how do you like Big Brother?’” said Officer Matt Jordan. “But I don’t have a problem with it. I like it.”

The camera helped clear him after a citizen complaint, Jordan said. Once, it defused a confrontation outside a bar: “As soon as they saw the cameras, they left.” In court cases, they’ve been used to secure a drug-related guilty plea and prove that an officer was shoved.

A 12-month Police Foundation experiment in Rialto, Calif., found similar results, according to a report from the city’s police chief. Officer force was used 2.5 times more when officers were not wearing cameras. There were 28 citizen complaints the year before; during the experiment, there were three.

Unlike the District and Virginia, in Maryland taping a private conversation requires the consent of both parties. But state courts have concluded that public stops are not private. Laurel police only offer to turn the camera off when they go into a home; they also turn them off in hospitals.

“I think everyone’s sort of waiting to see how the courts are going to accept some of these things,” said Cheverly Police Chief Joseph Frohlich. “I don’t think anyone knows what the limits are.” His officers also turn cameras off in private contexts.

Laurel officers can refer to the videos — automatically downloaded to a smartphone app — to write reports during each shift. At the end of the day, the cameras go into a docking station and the clips are automatically transferred to a data-storage Web site.

Unless the police mark a recording as evidence, it’s destroyed in 180 days. Officers can also ask for a copy to be burned onto a CD for use in court. Otherwise, they can’t touch them.
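As a rough sketch of the retention rule described above, the purge logic might look something like the following Python. Only the 180-day window and the “marked as evidence” exception come from the article; the field names and everything else are assumptions made for illustration.

```python
# Rough sketch of the retention rule described above: clips not marked as
# evidence are purged 180 days after upload. Only the 180-day window and
# the evidence exception come from the article; the rest is assumed.

from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION = timedelta(days=180)

@dataclass
class Clip:
    clip_id: str
    uploaded: datetime
    marked_as_evidence: bool = False

def purge_expired(clips, now=None):
    """Return only the clips that should be kept."""
    now = now or datetime.utcnow()
    return [
        c for c in clips
        if c.marked_as_evidence or now - c.uploaded < RETENTION
    ]

clips = [
    Clip("a1", uploaded=datetime(2014, 1, 2)),
    Clip("b2", uploaded=datetime(2014, 1, 2), marked_as_evidence=True),
]
print([c.clip_id for c in purge_expired(clips, now=datetime(2014, 8, 1))])
# -> ['b2']: the unmarked clip is past 180 days and is dropped.
```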

At the recent law enforcement conference, several chiefs said that in a world where people regularly use cellphones to film officers and post the choppy clips online, police need to be able to produce their own video.

“Everybody’s filming everybody,” said Philadelphia Police Commissioner Charles Ramsey, who led D.C. police from 1998 to 2006. “It’s the reality of the world we’re in; we can’t ignore it. We’ve just got to figure out a way to do it in a constitutional fashion.”

View Source

National Security Agency snoops are harvesting as many as 5 billion records daily to track mobile phones as they ping nearby cell towers across the globe.

That alarming scoop by The Washington Post via documents provided by NSA leaker Edward Snowden included wishful thinking from an unnamed government “intelligence lawyer” interviewed in the story. This official, according to the Post, said that the data “are not covered by the Fourth Amendment,” meaning a probable-cause warrant isn’t required to get it.

In reality, however, the case law on cell-site locational tracking — while generally favorable to the government — is far from clear, with federal courts and appellate courts offering mixed rulings on whether warrants are needed.

And it’s a big deal. As of last year, there were 326.4 million wireless subscriber accounts, exceeding the U.S. population, responsible for 2.3 trillion annual minutes of calls, according to the Wireless Association.

All the while, warrantless cell-phone location tracking has become a de facto method to snoop on criminals in the wake of the Supreme Court’s decision that probable-cause warrants from judges are generally needed to affix covert GPS devices to vehicles.

Yet the mobile-phone location data issue has never been squarely addressed by the Supreme Court, and the dispute isn’t likely to be heard by the justices any time soon. All of which means that the legality of the latest crime- or terror-fighting method of choice is equally up in the air.

The high court in June rejected an appeal (.pdf) from a drug courier sentenced to 20 years after being nabbed with 1,100 pounds of marijuana in a motor home camper the authorities tracked via his mobile phone pinging cell towers for three days from Arizona to a Texas truck stop.

In that case, and without comment, the Supreme Court let stand a ruling from the 6th U.S. Circuit Court of Appeals, which covers Kentucky, Michigan, Ohio and Tennessee. The appeals court ruled that probable-cause warrants were not necessary to obtain cell-site data.

The appeals court had distinguished the case from the GPS decision decided by the Supreme Court two years ago. The high court had ruled that the physical act of installing a GPS device on a target’s vehicle amounted to a search, which usually necessitates a probable-cause warrant under the Fourth Amendment.

“Here, the monitoring of the location of the contraband-carrying vehicle as it crossed the country is no more of a comprehensively invasive search than if instead the car was identified in Arizona and then tracked visually and the search handed off from one local authority to another as the vehicles progressed. That the officers were able to use less expensive and more efficient means to track the vehicles is only to their credit,” the three-judge appellate panel of the 6th Circuit ruled 2-1.

According to the Post, “the NSA pulls in location data around the world from 10 major ‘sigads,’ or signals intelligence activity designators. A sigad known as STORMBREW, for example, relies on two unnamed corporate partners described only as ARTIFICE and WOLFPOINT. According to an NSA site inventory, the companies administer the NSA’s ‘physical systems,’ or interception equipment, and ‘NSA asks nicely for tasking/updates.’”

Regarding whether that’s legal, the 5th U.S. Circuit Court of Appeals — which covers Louisiana, Mississippi and Texas — in July sided with the government in a case involving three lower court rulings concerning unidentified suspects. A lower court said “compelled warrantless disclosure of cell site data violates the Fourth Amendment.”

The government argued that a mobile-phone company may disclose historical cell-site records created and kept by the company in its ordinary course of business, where such an order is based on a showing of “specific and articulable facts” that there are reasonable grounds to believe that the records sought are relevant and material to an ongoing criminal investigation. A court warrant, on the other hand, requires the higher probable-cause standard under the Fourth Amendment. The appeals court agreed. (.pdf)

The government’s argument is based on a 1979 Supreme Court ruling upholding a Maryland purse snatcher’s conviction. The conviction and 10-year term came after the cops compelled the phone company to make a record of the numbers dialed by defendant Michael Lee Smith. A warrant, the high court reasoned, was not required because people do not have a reasonable expectation that the records they maintain with businesses would be kept private.

That same case has provided the legal justification for the NSA’s massive phone-metadata snooping program.

Still another appellate court to have ruled on the issue is the 3rd U.S. Circuit Court of Appeals. The appellate court said in 2010 that the lower courts have the option to demand a warrant for cell-site data. The court covers Delaware, New Jersey and Pennsylvania.

Meanwhile, U.S. District Judge Richard Bennet of Maryland last year cited the purse-snatching decision when declining to suppress evidence that Aaron Graham and Eric Jordan were allegedly involved in a string of Baltimore City fast-food restaurant robberies. They were arrested in connection with one robbery, and a seven-month historical look at their phone records placed them at the scene when other restaurants were robbed, the authorities said.

Bennet ruled:

For the following reasons, this Court concludes that the Defendants in this case do not have a legitimate expectation of privacy in the historical cell site location records (.pdf) acquired by the government. These records, created by cellular providers in the ordinary course of business, indicate the cellular towers to which a cellular phone connects, and by extension the approximate location of the cellular phone. While the implications of law enforcement’s use of this historical cell site location data raise the specter of prolonged and constant government surveillance, Congress in enacting the Stored Communications Act, has chosen to require only ‘specific and articulable facts’ in support of a government application for such records.

That decision is on appeal with the 4th U.S. Circuit Court of Appeals, which covers Virginia, West Virginia, North Carolina and South Carolina. Oral arguments are slated for next month.

View Source

A Night Watchman With Wheels?

The night watchman of the future is 5 feet tall, weighs 300 pounds and looks a lot like R2-D2 — without the whimsy. And will work for $6.25 an hour.

A company in California has developed a mobile robot, known as the K5 Autonomous Data Machine, as a safety and security tool for corporations, as well as for schools and neighborhoods.

“We founded Knightscope after what happened at Sandy Hook,” said William Santana Li, a co-founder of that technology company, now based in Sunnyvale, Calif. “You are never going to have an armed officer in every school.”

But what is for some a technology-laden route to safer communities and schools is to others an entry point to a post-Orwellian, post-privacy world.

“This is like R2-D2’s evil twin,” said Marc Rotenberg, the director of the Electronic Privacy Information Center, a privacy rights group based in Washington.

And the addition of such a machine to the labor market could force David Autor, a Massachusetts Institute of Technology economist, to rethink his theory about how technology wrecks the middle class.

The minimum wage in the United States is $7.25, and $8 in California. Coming in substantially under those costs, Knightscope’s robot watchman service raises questions about whether artificial intelligence and robotics technologies are beginning to assault both the top and the bottom of the work force as well.

The K5 is the work of Mr. Li, a former Ford Motor Company executive, and Stacy Dean Stephens, a former police officer in Texas. They gained some attention in June for their failed attempt to manufacture a high-tech police cruiser at Carbon Motors Corporation in Indiana.

Knightscope plans to trot out K5 at a news event on Thursday — a debut that is certain to touch off a new round of debate, not just about the impact of automation, but also about how a new generation of mobile robots affects privacy.

The co-founders have chosen to position K5 not as a job killer, but as a system that will upgrade the role of security guard, even if fewer humans are employed.

“We want to give the humans the ability to do the strategic work,” said Mr. Li in a recent telephone interview, describing a highly skilled analyst who might control a herd of security robots.

The robot, which can be seen in a promotional video, is still very much a work in progress. The system will have a video camera, thermal imaging sensors, a laser range finder, radar, air quality sensors and a microphone. It will also have a limited amount of autonomy, such as the ability to follow a preplanned route. It will not, at least for now, include advanced features like facial recognition, which is still being perfected.

Knightscope settled in Silicon Valley because it was hoping for a warm reception from technology companies that employ large security forces to protect their sprawling campuses.

There are about 1.3 million private security guards in the United States, and they are low paid for the most part, averaging about $23,000 a year, according to the Service Employees International Union. Most are not unionized, so they are vulnerable to low-cost automation alternatives.

K5 also raises questions about mass surveillance, which has already set off intense debate in the United States and Europe with the expansion of closed-circuit television systems on city streets and elsewhere. The Knightscope founders, however, have a radically different notion, which involves crime prediction, or “precog” — a theme of the movie “Minority Report.”

“We have a different perspective,” Mr. Li said. “We don’t want to think about ‘RoboCop’ or ‘Terminator,’ we prefer to think of a mash up of ‘Batman,’ ‘Minority Report’ and R2-D2.”

Mr. Li envisions a world of K5 security bots patrolling schools and communities, in what would amount to a 21st-century version of a neighborhood watch. The all-seeing mobile robots will eventually be wirelessly connected to a centralized data server, where they will have access to “big data,” making it possible to recognize faces, license plates and other suspicious anomalies.

Mr. Rotenberg said such abilities would rapidly encroach on traditional privacy rights.

“There is a big difference between having a device like this one on your private property and in a public space,” he said. “Once you enter public space and collect images and sound recordings, you have entered another realm. This is the kind of pervasive surveillance that has put people on edge.”

Mr. Li said he believed he could circumvent those objections by making the data produced by his robots available to anyone in a community with access to the Internet.

“As much as people worry about Big Brother, this is as much about putting the technology in the hands of the public to look back,” he said. “Society and industry can work together on this issue.”

This is essentially a reprise of the debate over Google’s Street View system, which has drawn opposition from privacy advocates. But while Google’s cars captured still images infrequently, a pervasive video and audio portal that autonomously patrolled a neighborhood would in effect be a real-time Street View system.

For the moment, the system is unarmed, and it is certain to become the target of teenagers who will undoubtedly get a thrill from knocking the robot over. Mr. Li said he believed this was not an insurmountable challenge, given the weight, size and video-recording ability of the bots.

Mr. Rotenberg said a greater challenge would be community opposition. He acknowledged, however, that K5’s looks were benign enough. “It doesn’t look like Arnold Schwarzenegger,” he said. “Unless he was rolled over and pressed into a ball.”

View Source