Google is not your friend, and with its Chrome browser and Maps, it could be your enemy

Depending on how you feel about having your privacy violated and getting scammed, you’re not going to like this latest information about Google.

Google Maps, which so many of us use to find locations and shop for services, is corrupted with false businesses, some of them scams, according to a lengthy Wall Street Journal investigation.

And Google Chrome, the Internet browser many of us switched to because it was faster and easier to use than Internet Explorer, is so cookie-friendly that the Washington Post calls it “surveillance software.”

There are ways around this.

Google Maps

First, let’s look at Google Maps.

Let’s say you need an emergency locksmith or a garage door repair company and you search Google. A map comes up as part of the search with virtual pins.

Only some of those pins aren’t for real businesses. They’re fronts for companies that ship leads to other companies, or, worse, they’re scam companies.

If you follow The Watchdog closely, this is not news to you. Two years ago, I shared the story of Shareen Grayson of Preston Hollow who unknowingly invited a convicted thief in to fix her freezer.

She found him on Google. A leads company had hijacked the phone number of a legitimate appliance business and passed it on to the thief.

Sad to say, two years later, Google hasn’t shut this scam down.

“The scams are profitable for nearly everyone involved,” the Wall Street Journal reports. “Google included. Consumers and legitimate businesses end up the losers.”

WSJ calls this “chronic deceit.”

Hundreds of thousands of false listings are posted to Google Maps and accompanying ads each month, the newspaper found.

Read More

Biased and wrong? Facial recognition tech in the dock

Police and security forces around the world are testing out automated facial recognition systems as a way of identifying criminals and terrorists. But how accurate is the technology, and how easily could it, and the artificial intelligence (AI) powering it, become tools of oppression?

Imagine a suspected terrorist setting off on a suicide mission in a densely populated city centre. If he sets off the bomb, hundreds could die or be critically injured.

CCTV scanning faces in the crowd picks him up and automatically compares his features to photos on a database of known terrorists or “persons of interest” to the security services.

The system raises an alarm and rapid deployment anti-terrorist forces are despatched to the scene where they “neutralise” the suspect before he can trigger the explosives. Hundreds of lives are saved. Technology saves the day.

But what if the facial recognition (FR) tech was wrong? It wasn’t a terrorist, just someone unlucky enough to look similar. An innocent life would have been summarily snuffed out because we put too much faith in a fallible system.

What if that innocent person had been you?

This is just one of the ethical dilemmas posed by FR and the artificial intelligence underpinning it.

Training machines to “see” – to recognise and differentiate between objects and faces – is notoriously difficult. Not so long ago, computer vision, as it is sometimes called, was struggling to tell the difference between a muffin and a chihuahua, a litmus test of this technology.

Read More

FBI, ICE find state driver’s license photos are a gold mine for facial-recognition searches

Agents with the Federal Bureau of Investigation and Immigration and Customs Enforcement have turned state driver’s license databases into a facial-recognition gold mine, scanning through millions of Americans’ photos without their knowledge or consent, newly released documents show.

Thousands of facial-recognition requests, internal documents and emails over the past five years, obtained through public-records requests by researchers with Georgetown Law’s Center on Privacy and Technology and provided to The Washington Post, reveal that federal investigators have turned state departments of motor vehicles databases into the bedrock of an unprecedented surveillance infrastructure.

Police have long had access to fingerprints, DNA and other “biometric data” taken from criminal suspects. But the DMV records contain the photos of a vast majority of a state’s residents, most of whom have never been charged with a crime.

Neither Congress nor state legislatures have authorized the development of such a system, and growing numbers of Democratic and Republican lawmakers are criticizing the technology as a dangerous, pervasive and error-prone surveillance tool.

“Law enforcement’s access of state databases,” particularly DMV databases, is “often done in the shadows with no consent,” House Oversight Committee Chairman Elijah E. Cummings (D-Md.) said in a statement to The Post.

Read More

Mass Surveillance Is Coming to a City Near You

The tech entrepreneur Ross McNutt wants to spend three years recording outdoor human movements in a major U.S. city, KMOX news radio reports.

If that sounds too dystopian to be real, you’re behind the times. McNutt, who runs Persistent Surveillance Systems, was inspired by his stint in the Air Force tracking Iraqi insurgents. He tested mass-surveillance technology over Compton, California, in 2012. In 2016, the company flew over Baltimore, feeding information to police for months (without telling city leaders or residents) while demonstrating how the technology works to the FBI and Secret Service.

The goal is noble: to reduce violent crime.

There’s really no telling whether surveillance of this sort has already been conducted over your community as private and government entities experiment with it. If I could afford the hardware, I could legally surveil all of Los Angeles just for kicks.

And now a billionaire donor wants to help Persistent Surveillance Systems monitor the residents of an entire high-crime municipality for an extended period of time––McNutt told KMOX that it may be Baltimore, St. Louis, or Chicago.

McNutt’s technology is straightforward: A fixed-wing plane outfitted with high-resolution video cameras circles for hours on end, recording everything in large swaths of a city. One can later “rewind” the footage, zoom in anywhere, and see exactly where a person came from before or went after perpetrating a robbery or drive-by shooting … or visiting an AA meeting, a psychiatrist’s office, a gun store, an abortion provider, a battered-women’s shelter, or an HIV clinic. On the day of a protest, participants could be tracked back to their homes.

In the timely new book Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All, the author Arthur Holland Michel talks with people working on this category of technology and concludes, “Someday, most major developed cities in the world will live under the unblinking gaze of some form of wide-area surveillance.”

At first, he says, the sheer amount of data will make it impossible for humans in any city to examine everything that is captured on video. But efforts are under way to use machine learning and artificial intelligence to “understand” more. “If a camera that watches a whole city is smart enough to track and understand every target simultaneously,” he writes, “it really can be said to be all-seeing.”

Read More

Cellebrite Cracks All iOS Devices, Company Announces

The “arms race” of mobile forensics – ever-tougher encryption and the breakneck operations to crack it – has become more of a public tug-of-war than ever before.

Cellebrite, the largest player in the mobile-forensics industry, unveiled its UFED Premium last Friday. Along with the announcement came the bombshell: that it can now get into any Apple iOS device, and many of the high-end Android devices.

“An exclusive solution for law enforcement to unlock and extract data from all iOS and Android devices,” the company said in a tweet.

Those devices have historically been the toughest to crack – and Cellebrite’s newfound ability to perform a full-file system extraction on any iOS device in particular would allow law enforcement “to get much more data than what is possible through logical extractions and other conventional means.”

“Our certified forensic experts can also help you gain access to sensitive mobile evidence from several locked, encrypted or damaged iOS and Android devices using advanced in-lab-only techniques,” the company added in its Friday announcement.

The latest tool works on Apple devices running anything from iOS 7 to iOS 12.3, according to the company. Among the Android devices covered are the Samsung S6, S7, S8, and S9. Also supported are the most popular models of Motorola, Huawei, LG and Xiaomi.

The announcement follows the highly publicized breakthrough of the GrayKey devices made by Grayshift more than a year ago. The GrayKey tool had exploited a low-power loophole in some iOS systems, one expert explained to Forensic Magazine. But Apple put in a fix late last year to stop that access. Since then, GrayKey has made some inroads on some Apple devices – but not all of them, according to experts.

Read More

Creator of Website That Stole ATM Card Numbers Sentenced

Nearly half a million Alabama cell phone numbers received identical text messages in 2015 telling them to click a link to “verify” their bank account information. The link took recipients to a realistic-looking bank website where they typed in their personal financial information.

But the link was not the actual bank’s website—it was part of a phishing scam. Just like phishing messages sent over email, the text message-based scam was easy to fall for. The web address was only one character off from the bank’s actual web address.
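The “one character off” trick described above is a classic typosquat, and it is simple enough to illustrate. The sketch below (purely illustrative; the domain names are made up, not the bank from the case) flags an address whose edit distance from a known legitimate domain is small but nonzero:

```python
# Toy lookalike-domain check. A phishing address "one character off"
# from the real one has Levenshtein edit distance 1, which a simple
# dynamic-programming comparison can flag.

def edit_distance(a: str, b: str) -> int:
    """Classic single-row Levenshtein distance between two strings."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,          # deletion
                        dp[j - 1] + 1,      # insertion
                        prev + (ca != cb))  # substitution (free if equal)
            prev = cur
    return dp[-1]

def is_lookalike(candidate: str, legit: str, max_dist: int = 1) -> bool:
    """Flag a domain that is close to, but not identical to, the real one."""
    return candidate != legit and edit_distance(candidate, legit) <= max_dist

print(is_lookalike("flrstbank.com", "firstbank.com"))  # True: 'i' became 'l'
print(is_lookalike("firstbank.com", "firstbank.com"))  # False: exact match
```

Real anti-phishing filters combine checks like this with homoglyph tables and domain-age signals, but the core idea is the same: near misses are more suspicious than total strangers.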

While most recipients appeared to ignore the message, around 50 people clicked on the link and provided their personal information. The website asked for account numbers, names, and ZIP codes, along with their associated debit card numbers, security codes, and PINs. Within an hour, the fraudster had made himself debit cards with the victims’ account information. He then began to withdraw money from various ATMs, stealing whatever the daily ATM maximum was from each account.

“It was a fairly legitimate-looking website, other than the information it was asking for,” said Special Agent Jake Frith of the Alabama Attorney General’s Office, who worked the case along with investigators from the FBI’s Mobile Field Office.

The fraudster, Iosif Florea, stole about $18,000 (including ATM fees), with losses from each individual account ranging from $20 to $800. (Banks typically reimburse customers who are victims of fraud.)

Investigators believe Florea bought a large list of cell phone numbers from a marketing company, and he only needed a few victims out of thousands of phone numbers for the scheme to be successful.

The damage was minimized, however, because of the bank’s quick response. As soon as customers reported the fraud, the bank reached out to federal authorities as well as the local media to alert the community to the fraudulent messages.

Read More

School’s Plan for Facial Recognition System Raises Concerns

A New York school district has finished installing a facial recognition system intended to spot potentially dangerous intruders, but state officials concerned about privacy say they want to know more before the technology is put into use.

Education Department spokeswoman Emily DeSantis said Monday that department employees plan to meet with Lockport City School officials about the system being tested this week. In the meantime, she said, the district has said it will not use facial recognition software while it checks other components of the system.

The rapidly developing technology has made its way into airports, motor vehicle departments, stores and stadiums, but is so far rare in public schools.

Lockport is preparing to bring its system online as cities elsewhere are considering reining in the technology’s use. San Francisco in May became the first U.S. city to ban its use by police and other city departments and Oakland is among others considering similar legislation.

A bill by Democrat Assembly Member Monica Wallace would create a one-year moratorium on the technology’s use in New York schools to allow lawmakers time to review it and draft regulations. The legislation is pending.

Lockport Superintendent Michelle Bradley, on the district’s website, said the district’s initial implementation of the system this week will include adjusting cameras mounted throughout the buildings and training staff members who will monitor them from a room in the high school. The system is expected to be fully online on Sept. 1.

Read More

It’s the middle of the night. Do you know who your iPhone is talking to?

It’s 3 a.m. Do you know what your iPhone is doing?

Mine has been alarmingly busy. Even though the screen is off and I’m snoring, apps are beaming out lots of information about me to companies I’ve never heard of. Your iPhone probably is doing the same — and Apple could be doing more to stop it.

On a recent Monday night, a dozen marketing companies, research firms and other personal data guzzlers got reports from my iPhone. At 11:43 p.m., a company called Amplitude learned my phone number, email and exact location. At 3:58 a.m., another called Appboy got a digital fingerprint of my phone. At 6:25 a.m., a tracker called Demdex received a way to identify my phone and sent back a list of other trackers to pair up with.

And all night long, there was some startling behavior by a household name: Yelp. It was receiving a message that included my IP address, once every five minutes.

Our data has a secret life in many of the devices we use every day, from talking Alexa speakers to smart TVs. But we’ve got a giant blind spot when it comes to the data companies probing our phones.

You might assume you can count on Apple to sweat all the privacy details. After all, it touted in a recent ad, “What happens on your iPhone stays on your iPhone.” My investigation suggests otherwise.

iPhone apps I discovered tracking me by passing information to third parties, just while I was asleep, include Microsoft OneDrive, Intuit’s Mint, Nike, Spotify, The Washington Post and IBM’s The Weather Channel. One app, the crime-alert service Citizen, shared personally identifiable information in violation of its published privacy policy.

And your iPhone doesn’t only feed data trackers while you sleep. In a single week, I encountered over 5,400 trackers, mostly in apps, not including the incessant Yelp traffic. According to privacy firm Disconnect, which helped test my iPhone, those unwanted trackers would have spewed out 1.5 gigabytes of data over the span of a month. That’s half of an entire basic wireless service plan from AT&T.

“This is your data. Why should it even leave your phone? Why should it be collected by someone when you don’t know what they’re going to do with it?” says Patrick Jackson, a former National Security Agency researcher who is chief technology officer for Disconnect. He hooked my iPhone into special software so we could examine the traffic. “I know the value of data, and I don’t want mine in any hands where it doesn’t need to be,” he told me.

Read More

Not Only Can Alexa Eavesdrop — She Can Also Testify Against You

When it was revealed last month that a team of Amazon workers were tasked with listening to and reviewing Echo customers’ recordings—including those that customers never intended to record—the news sparked a flurry of criticism and concern regarding what this meant for the average consumer’s privacy.

At the same time, many were left unsurprised. Previous incidents, such as when an Amazon customer in Germany accidentally received someone else’s private Alexa recordings last year, have shown not only that the devices can record when least expected (such as when the user is in the shower, or having a private conversation) but also that these recordings can end up in unexpected hands.

This reality can leave users feeling that the device that helps them control their schedule, their music and even their home appliances isn’t completely within their control. In fact, the Echo can even be used against its owner—and may have the potential to send some users to prison.

As explained by Oxygen Forensics COO Lee Reiber in an interview with Forensic Magazine, when you live with an Alexa device, “it’s almost like your room is bugged.” Of course the “almost” is that Alexa isn’t necessarily always recording, but that doesn’t mean it only records when it’s supposed to either.

“We have a sample Alexa (…) that I utilize to do research on, and there is a lot of information on there. And I found several (recordings) that are specifically marked by Amazon as an error,” said Reiber, who has firsthand experience using Oxygen’s digital forensic tools to extract data from Echo devices. “I’m sitting there in my kitchen and I am talking to my wife, and it’s recording that information.”

Echo devices are meant to record what the user says to it after using a “wake word”—either “Echo,” “Amazon,” “computer” or the classic “Alexa,” depending on what the user prefers. The catch is that Alexa, which always has its microphone on listening for that word, has a habit of mishearing other words or sounds as its wake word, causing it to activate and record the voices or noises that follow.

I’ve noticed this with my own Echo Dot device, which sometimes lights up blue on its own, or startles me with a robotic “I’m sorry, I didn’t catch that” when I never said anything to begin with. Reiber also said those kitchen conversations with his wife were recorded without permission from a wake word, and plenty of other users have reported similar experiences with accidentally waking up their all-hearing assistant.
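The false-trigger behavior described above can be mimicked with a toy similarity check. This is an assumption-laden sketch: real wake-word detection runs acoustic models on audio, not string matching on text, but the effect is the same, a near miss scores above the trigger threshold:

```python
# Toy illustration of wake-word false triggers. Real devices score
# audio against an acoustic model; here we stand in for that with a
# text-similarity ratio, just to show how near misses slip through.
from difflib import SequenceMatcher

WAKE_WORD = "alexa"
THRESHOLD = 0.7  # arbitrary cutoff: how close a heard word must be

def triggers(heard: str) -> bool:
    """Return True if the heard word is close enough to the wake word."""
    score = SequenceMatcher(None, heard.lower(), WAKE_WORD).ratio()
    return score >= THRESHOLD

print(triggers("Alexa"))   # True: the real wake word
print(triggers("alexis"))  # True: a near miss still wakes the device
print(triggers("banana"))  # False: clearly different
```

Lowering the threshold makes the assistant more responsive but also more prone to exactly the accidental recordings Reiber describes; that trade-off is inherent to any always-listening trigger.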

As Reiber explained, Amazon typically marks unintentional recordings as an error, and in forensic tools like Oxygen’s extractor, they show up marked as discarded items, similar to files someone has deleted from their phone or computer but that are still there in the device’s memory. And like these unseen “deleted” files that any skilled digital examiner can recover and view, those accidental recordings are still available to investigators in full—and have the potential to become valuable forensic evidence in a case.

“Because they are already recording, any of these types of IoT (internet of things) devices can be tremendous, because again, if it’s still listening, it could record, and the quality is fantastic,” said Reiber, who also has a law enforcement background. “It’s just a great recording of the person who’s actually speaking. So, someone could say, ‘Well, it wasn’t me, it wasn’t me talking.’ Well, no, it is, it’s an exact recording of your voice.”

Read More

New Forensic Technology Examines Fingerprints Once Considered Too Old

CARLISLE, Pa. (WHTM) - New technology allows investigators to examine fingerprints that were once considered too old or compromised to analyze.

A vacuum metal deposition instrument is now in the hands of Cumberland County to better collect fingerprints and DNA. This equipment is only the second of its kind in Pennsylvania and one of 14 in the entire country.

“Gold will deposit on the substrate and then I will run zinc, and zinc doesn’t attach anywhere else except to a different metal and then it will attach to the gold that I’ve placed,” said Carol McCandless, the lead forensic investigator for Cumberland County.

The vacuum sucks all the air and water out of a chamber, then the machine coats the evidence with a very thin metal film under a high vacuum, all done in less than five minutes.

“The metallic substances don’t land on the top of the ridges of anything. It goes in between so that the top of the ridge is touched and that’s where the DNA is,” said Skip Ebert, Cumberland County District Attorney.

This machine locates fingerprints on items from which they were previously tough or impossible to extract, things like paper, waxy substances, and clothing.

“What we did in this machine was a pillowcase: the actual victim’s face being suffocated, and on the opposite side of the pillowcase, the actual hands that were pushing it down on his face. You cannot beat that kind of evidence anywhere,” said Ebert.

And this doesn’t just help current and future cases.

“I have received several cold cases, one from 1983 and one from 1995,” said McCandless.

The Cumberland County Forensic Lab is expected to be fully accredited in the fall, thanks largely to this new technology, made possible through a grant from the Pennsylvania Commission on Crime and Delinquency.

View Source