INNOVATION: Digital Social Networking Tips For Stanford Students


THINGS TO DO, AND NOT DO, ON THE INTERNET...

How do people interact in the digital, non-physical world? Lucky for you, a Stanford research group got thousands of web users to write up the answers...and here they are: ONLINE_HUMAN_INTERACTION.pdf

 

With all that online interaction, you'll want to use good web hygiene and practice safe-webbing. There's a booklet for that too, produced by a big group of IT folks: PERSONAL INTERNET SECURITY – Rev 4.pdf

 

Who owns your face? Social media mobs raise new privacy concerns

In this Friday, Jan. 18, 2019, image made from video provided by the Survival Media Agency, a teenager wearing a “Make America Great Again” hat, center left, stands in front of an elderly Native American singing and playing a drum.

ANALYSIS/OPINION:

You might own your car, your house, your pet and your 401(k). But you don’t own your own photographic image.

That’s one of the lessons of Rashomon on the Potomac, the bizarre fracas that occurred over the weekend on the National Mall involving Omaha elder Nathan Phillips, the Black Hebrew Israelites and a group of boys from Covington Catholic High School in Kentucky.

The matter shot to national attention after a short video was posted on Twitter depicting (part of) the incident. The focal point of the video was the image of a drumming Mr. Phillips standing up close to one of the students, who was wearing a “Make America Great Again” cap.

The boy, it was widely said, was “smirking” throughout the encounter. That smirk was blasted across the globe. Eminences such as Reza Aslan, a creative writing professor who plays a religious historian on television, deemed the boy’s face “punchable” to his nearly 300,000 Twitter followers.

You’re not allowed to plaster a (punchable) photograph of Kanye West on a cereal box without his permission. That’s because “U.S. law has for many years recognized a right of [legal] action if a person’s image is used for commercial gain,” explains Peter Swire, a law professor at Georgia Tech and a longtime expert on privacy issues.

You can’t monetize somebody’s image without his permission, or even use his photo to imply that he endorses a product. This is dubbed the “right of publicity.”

But absent blatant commercial use, you can distribute a person’s image. The First Amendment applies broadly here. It’s what allows, for instance, the tabloid media to operate — as Amazon honcho Jeff Bezos has unhappily learned recently.

And unlike libel law, which applies different legal standards to public and private figures, the First Amendment guarantees a right to distribute imagery without making any such distinction.

I’m as free to photograph a celebrity and post the picture online as I am a random stranger. That’s why appalling websites such as “People of Walmart,” which hold lower-middle-class Americans up for ridicule, are allowed to exist.

Where the law does take exception is when someone posts a fake or doctored image. That behavior is legally actionable. So is distributing imagery that portrays events in a “false light,” where the image is genuine but presented in a dishonest or manipulative manner.

It’s said that certain cultures believe taking a photograph steals one’s soul. We needn’t go that far to recognize that there’s something unsettling about the notion that we don’t control the distribution of our own image.

Indeed, even within the framework of the First Amendment, exceptions have been carved out that recognize this fact, like the aforementioned right of publicity.

The criminalization of “revenge porn” — the online posting of prurient images of one’s former lovers — is another point of complication. Revenge porn has been banned in more than half of the states.

Yet, like the images shot on the National Mall, revenge porn is “true,” suggesting that First Amendment rights apply.

Danielle Citron, a law professor at the University of Maryland and leading proponent of revenge porn laws, argues that this is not so, noting in a paper that “certain categories of speech can be regulated due to their propensity to bring about serious harms and only slight contributions to First Amendment values.” That standard, of course, would seem to allow the criminalization of any manner of photographic distribution beyond revenge porn.

The Covington students, for instance, have faced death threats as a result of their images being distributed. This certainly qualifies as “serious harms.”

The First Amendment was crafted long before every American became an amateur photographer and broadcaster, walking around every day with a personal television studio in his pocket and, thanks to social media, a film distribution company, too. Privacy scholars should think of ways to give people more control over the distribution of their own image — whether it’s “punchable” or not.

-----------------------------------------------------------------

 

Amazon shareholders trying to prevent the company from selling its facial recognition software to police, citing privacy concerns (thehill.com)

------------------------------------------

BE CAREFUL OF YOUR FACE ON THE WEB

Face-tracking harvesters grab a single picture of you and then use AI to find every other digital picture of you on the web. They open every social media post, resume, news clipping, dating account and so on, sell the full dossier on you to Acxiom, the NSA, political manipulators and others, and use it to hack your bank accounts and credit cards. Never put an unsecured photo of yourself online. Anybody can take a screen grab of your photo on here, put it into Google's or Palantir's reverse image search, find all your other images and social media accounts online, and get into your bank account or medical records in 30 minutes.

The internet's failed security is in the headlines every day, and the danger of posting pictures on the web is covered in every major newspaper. Fusion GPS, Black Cube and political operatives harvest every photo on here every hour and use the data to spy on people for political dirty tricks. The FBI, CIA, NSA and most three-letter law enforcement spy operations copy everything on this site and analyze it. Don't you wonder why you never see anybody famous, political, in public service or in law on a dating site? Read Edward Snowden's book 'Permanent Record' or any weekly report at Krebs On Security.

Huge numbers of the profiles on here are fake, Nigerian-scammer-style accounts. 2D pictures have no bearing on the 3D experience of people in person, and I am only interested in meeting people in person. Nobody has ever been killed at a Starbucks; there is nothing unsafe about meeting at a highly public Starbucks or Peet's. I learned my lessons. There are hundreds of thousands of bait profiles on here. The real people show up for the coffee; the fake ones in Nigeria and the political spies never show up in person and have a million carefully prepared excuses why not.

For example: Yandex is by far the best reverse image search engine, with a scary-powerful ability to recognize faces, landscapes, and objects. This Russian site draws heavily upon user-generated content, such as tourist review sites (e.g. Foursquare and TripAdvisor) and social networks (e.g. dating sites), for remarkably accurate results with facial and landscape recognition queries. To use Yandex, go to images.yandex.com, then choose the camera icon on the right. From there, you can either upload a saved image or type in the URL of one hosted online.
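If you would rather script that step than use the web form, a minimal sketch is below. It assumes the photo is already hosted at a public URL, and that the rpt=imageview and url query parameters still behave the way the search-by-URL form has historically exposed them; treat those parameter names as an assumption, since Yandex can change its interface at any time.

```python
from urllib.parse import urlencode

def yandex_reverse_search_url(image_url: str) -> str:
    """Build a Yandex search-by-image URL for a photo hosted online.

    The "rpt=imageview" / "url=" parameters are an assumption based on how
    the public search-by-URL form has worked; Yandex may change them.
    """
    params = urlencode({"rpt": "imageview", "url": image_url})
    return f"https://yandex.com/images/search?{params}"

# Hypothetical example: the photo URL is a placeholder, not a real image.
print(yandex_reverse_search_url("https://example.com/photos/me.jpg"))
```

Opening the printed URL in a browser runs the same reverse image search you would get by pasting the address into the camera-icon form.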

If you get stuck with the Russian user interface, look out for Выберите файл (Choose file), Введите адрес картинки (Enter image address), and Найти (Search). After searching, look out for Похожие картинки (Similar images) and Ещё похожие (More similar). The facial recognition algorithms used by Yandex are shockingly good. Not only will Yandex look for photographs that are visually similar to the one containing a face, it will also find other photographs of the same person (determined by matching facial features) taken with completely different lighting, background colors, and poses. Google and Bing tend only to surface other photographs of a person with similar clothes and general facial features; Yandex will find those matches and, in addition, other photographs that match the face itself.
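To see why a facial match can survive those changes, here is a minimal sketch using the open-source face_recognition Python library. The file names profile_photo.jpg and screen_grab.jpg are placeholders, and the 0.6 distance cut-off is just the library's conventional default; none of this is Yandex's actual code.

```python
import face_recognition  # open-source wrapper around dlib's face embeddings

# Two placeholder files: a known photo of the person and an unrelated snapshot.
known_image = face_recognition.load_image_file("profile_photo.jpg")
candidate_image = face_recognition.load_image_file("screen_grab.jpg")

# Each detected face is reduced to a 128-dimensional embedding. Lighting,
# background and pose mostly wash out of the embedding, which is why the
# same face can be matched across very different photos.
known_encoding = face_recognition.face_encodings(known_image)[0]  # assumes a face was found
candidate_encodings = face_recognition.face_encodings(candidate_image)

for encoding in candidate_encodings:
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    if distance < 0.6:  # the library's conventional match threshold
        print(f"Likely the same person (distance {distance:.2f})")
    else:
        print(f"Probably a different person (distance {distance:.2f})")
```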

What if any stranger could snap your picture on the sidewalk, or grab it from Match.com, and then use an app to quickly discover your name, address and other details? A startup called Clearview AI has made that possible, and its app is currently being used by hundreds of law enforcement agencies in the US, including the FBI, says a report in The New York Times.

The app, says the Times, works by comparing a photo to a database of more than 3 billion pictures that Clearview says it's scraped off Facebook, Venmo, YouTube and other sites. It then serves up matches, along with links to the sites where those database photos originally appeared. A name might easily be unearthed, and from there other info could be dug up online.
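At its core, that kind of lookup is a nearest-neighbor search over face embeddings. The toy sketch below assumes a small, hypothetical index of pre-computed 128-dimensional embeddings saved alongside their source URLs; it illustrates the idea only and is not Clearview's actual system or API.

```python
import numpy as np

# Hypothetical index files: one 128-dim embedding per scraped photo,
# plus the page each photo was scraped from.
index_embeddings = np.load("scraped_face_embeddings.npy")      # shape (N, 128)
with open("scraped_photo_urls.txt") as f:
    index_urls = [line.strip() for line in f]

def top_matches(query_embedding: np.ndarray, k: int = 5):
    """Return the k closest photos in the index by Euclidean distance."""
    distances = np.linalg.norm(index_embeddings - query_embedding, axis=1)
    best = np.argsort(distances)[:k]
    return [(index_urls[i], float(distances[i])) for i in best]

# A production system searching billions of photos would use an
# approximate-nearest-neighbor index; brute force only works for small N.
```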

The size of the Clearview database dwarfs others in use by law enforcement. The FBI's own database, which taps passport and driver's license photos, is one of the largest, with over 641 million images of US citizens.

Political spies have even better programs than this. Watch out! The web is not safe!

 
