A WORLD BECOMING LESS PRIVATE
Airbnb has recently developed trait-analyzing software powered by artificial intelligence. This comes in the wake of reports from a host in London who says guests wrecked her property with a rowdy party after renting it for a baby shower. Incidents like this present a PR problem for Airbnb, so the company seeks to reduce their prevalence by screening users’ personalities to determine whether they can be trusted with a property. In an ever-connected digital world where privacy seems to be a thing of the past, we must be very careful with what we put online, or we may find ourselves rejected by Airbnb.
BACKGROUND CHECK TECHNOLOGY
This background check technology was revealed in a patent published by the European Patent Office, after being granted in the US last year. According to the patent, Airbnb deploys AI-powered web crawlers to scan users’ social media accounts for various personality traits. Aside from the usual credit and identity checks, the software looks for traits such as conscientiousness and openness. It also looks for what the company considers negative traits, such as neuroticism, Machiavellianism, and psychopathy, to determine trustworthiness.
The software also looks for a trait it calls “involvement in crimes,” which of course is not a personality trait at all. It simply means Airbnb doesn’t want to see photos on your social media of you engaging in illegal activities, or posts where you discuss doing so. The software goes even further than that, flagging users who are associated with fake social network profiles or who have given any false details. That means those of you who keep social profiles with less-than-honest personal information on them could be unjustly flagged as untrustworthy.
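To make the flagging idea concrete, a crude keyword-based post filter of the general kind the patent gestures at might look like the sketch below. Everything here is invented for illustration: the keyword list, function name, and threshold logic are assumptions, not details from the patent.

```python
# Hypothetical sketch of keyword-based post flagging. The keyword
# list and matching logic are invented for illustration only.
ILLEGAL_ACTIVITY_KEYWORDS = {"counterfeit", "stolen", "burglary"}

def flag_post(text: str) -> bool:
    """Return True if a post mentions any flagged keyword."""
    # Normalize by lowercasing and stripping common punctuation.
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not words.isdisjoint(ILLEGAL_ACTIVITY_KEYWORDS)

posts = [
    "Great weekend hiking with friends!",
    "Selling counterfeit tickets, DM me",
]
flags = [flag_post(p) for p in posts]
```

Even this toy version shows the obvious weakness of the approach: it matches words, not intent, so a joke or a news link could trip the same flag as a genuine admission.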
DISCRIMINATION AND ETHICAL CONCERNS
The AI tool will also score you poorly if keywords, images, or video associated with you involve drugs or alcohol, hate websites or organizations, or sex work. The mention of hate websites or organizations is interesting: who defines hate, exactly? In this case, it would be Airbnb. Suddenly the potential for serious discrimination and misuse on Airbnb’s part has to be called into question.
It doesn’t stop there. This machine learning tool also scans news stories that may link the person to a crime and can place a “weight” on the offense. In this way it attempts to compile a data set reflecting how a person acts offline, cross-referencing these metrics with generic factors such as employment and education history.
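The weighting of offenses described above suggests something like a weighted sum feeding a single risk number. The following is a speculative sketch only; every signal name, weight, and the combination formula are invented, not taken from Airbnb’s patent.

```python
# Speculative sketch of a weighted risk score. All weights and
# signal names are invented for illustration.
OFFENSE_WEIGHTS = {"vandalism": 0.8, "noise_complaint": 0.3}

def risk_score(offenses: list[str], trait_scores: dict[str, float]) -> float:
    """Combine weighted offenses with averaged trait signals into [0, 1]."""
    # Unknown offense types get a middling default weight of 0.5.
    offense_total = sum(OFFENSE_WEIGHTS.get(o, 0.5) for o in offenses)
    # Average the trait signals (each assumed to already be in [0, 1]).
    trait_total = sum(trait_scores.values()) / max(len(trait_scores), 1)
    # Clamp to [0, 1] so a downstream threshold is simple to apply.
    return min(1.0, 0.5 * offense_total + 0.5 * trait_total)

score = risk_score(["noise_complaint"],
                   {"neuroticism": 0.4, "conscientiousness": 0.2})
```

A sketch like this makes the core objection easy to see: the output looks precise, but it is only as meaningful as the arbitrary weights and dubious trait estimates fed into it.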
This is what it says on the Airbnb website: “Every Airbnb reservation is scored for risk before it’s confirmed. We use predictive analytics and machine learning to instantly evaluate hundreds of signals that help us flag and investigate suspicious activity before it happens.” These are the kinds of decisions humans used to exercise judgment on. Now we are becoming overly reliant on machine learning, and that is raising serious concerns.
HOARDING DATA ON USERS
On the surface, much of what this machine learning tool looks for in potential users doesn’t seem at all relevant to whether they will destroy someone’s property. Traits like neuroticism, openness, and extraversion do not significantly correlate with an individual’s level of responsibility or maturity; traits such as psychopathy are so rare they aren’t worth discussing; and flagging users associated with “hate” is silly, as there is no good way to define what constitutes “hate.” The only personality trait that would be relevant here is conscientiousness, since it does in part determine how reliable a person is. However, no machine learning tool can accurately quantify a person’s conscientiousness from their social media posts. The idea is laughable at best and unethical at worst.
If Airbnb’s only intention were to filter out unreliable users to prevent property damage, all it would actually need is a person’s credit score and employment history. In other words, a standard background check would suffice. The tool Airbnb has created goes far beyond that, attempting to psychoanalyze a person based on social media posts and cherry-pick arbitrary reasons to deny them use of a property. This looks like just another excuse for a tech company to gather excessive data on its users and sell it to third parties.
The takeaway here is to exercise increased caution when posting details of your life on social media. You never know when a company will use arbitrary bits of information to deny you service.