
Guardian: Google, democracy and the truth about internet search


Totakeke

Member

Are Jews evil? It’s not a question I’ve ever thought of asking. I hadn’t gone looking for it. But there it was. I press enter. A page of results appears. This was Google’s question. And this was Google’s answer: Jews are evil. Because there, on my screen, was the proof: an entire page of results, nine out of 10 of which “confirm” this. The top result, from a site called Listovative, has the headline: “Top 10 Major Reasons Why People Hate Jews.” I click on it: “Jews today have taken over marketing, militia, medicinal, technological, media, industrial, cinema challenges etc and continue to face the worlds [sic] envy through unexplained success stories given their inglorious past and vermin like repression all over Europe.”

Stories about fake news on Facebook have dominated certain sections of the press for weeks following the American presidential election, but arguably this is even more powerful, more insidious. Frank Pasquale, professor of law at the University of Maryland, and one of the leading academic figures calling for tech companies to be more open and transparent, calls the results “very profound, very troubling”.

He came across a similar instance in 2006 when, “If you typed ‘Jew’ in Google, the first result was jewwatch.org. It was ‘look out for these awful Jews who are ruining your life’. And the Anti-Defamation League went after them and so they put an asterisk next to it which said: ‘These search results may be disturbing but this is an automated process.’ But what you’re showing – and I’m very glad you are documenting it and screenshotting it – is that despite the fact they have vastly researched this problem, it has gotten vastly worse.”

Next I type: “a-r-e m-u-s-l-i-m-s”. And Google suggests I should ask: “Are Muslims bad?” And here’s what I find out: yes, they are. That’s what the top result says and six of the others. Without typing anything else, simply putting the cursor in the search box, Google offers me two new searches and I go for the first, “Islam is bad for society”. In the next list of suggestions, I’m offered: “Islam must be destroyed.”

Jews are evil. Muslims need to be eradicated. And Hitler? Do you want to know about Hitler? Let’s Google it. “Was Hitler bad?” I type. And here’s Google’s top result: “10 Reasons Why Hitler Was One Of The Good Guys”. I click on the link: “He never wanted to kill any Jews”; “he cared about conditions for Jews in the work camps”; “he implemented social and cultural reform.” Eight out of the other 10 search results agree: Hitler really wasn’t that bad.

A few days later, I talk to Danny Sullivan, the founding editor of SearchEngineLand.com. He’s been recommended to me by several academics as one of the most knowledgeable experts on search. Am I just being naive, I ask him? Should I have known this was out there? “No, you’re not being naive,” he says. “This is awful. It’s horrible. It’s the equivalent of going into a library and asking a librarian about Judaism and being handed 10 books of hate. Google is doing a horrible, horrible job of delivering answers here. It can and should do better.”

But it seems the implications of the power and reach of these companies are only now seeping into the public consciousness. I ask Rebecca MacKinnon, director of the Ranking Digital Rights project at the New America Foundation, whether it was the recent furore over fake news that woke people up to the danger of ceding our rights as citizens to corporations. “It’s kind of weird right now,” she says, “because people are finally saying, ‘Gee, Facebook and Google really have a lot of power’ like it’s this big revelation. And it’s like, ‘D’oh.’”

MacKinnon has a particular expertise in how authoritarian governments adapt to the internet and bend it to their purposes. “China and Russia are a cautionary tale for us. I think what happens is that it goes back and forth. So during the Arab spring, it seemed like the good guys were further ahead. And now it seems like the bad guys are. Pro-democracy activists are using the internet more than ever but at the same time, the adversary has gotten so much more skilled.”

Last week Jonathan Albright, an assistant professor of communications at Elon University in North Carolina, published the first detailed research on how rightwing websites had spread their message. “I took a list of these fake news sites that was circulating, I had an initial list of 306 of them and I used a tool – like the one Google uses – to scrape them for links and then I mapped them. So I looked at where the links went – into YouTube and Facebook, and between each other, millions of them… and I just couldn’t believe what I was seeing.

“They have created a web that is bleeding through on to our web. This isn’t a conspiracy. There isn’t one person who’s created this. It’s a vast system of hundreds of different sites that are using all the same tricks that all websites use. They’re sending out thousands of links to other sites and together this has created a vast satellite system of rightwing news and propaganda that has completely surrounded the mainstream media system.

Is bias built into the system? Does it affect the kind of results that I was seeing? “There’s all sorts of bias about what counts as a legitimate source of information and how that’s weighted. There’s enormous commercial bias. And when you look at the personnel, they are young, white and perhaps Asian, but not black or Hispanic and they are overwhelmingly men. The worldview of young wealthy white men informs all these judgments.”

Later, I speak to Robert Epstein, a research psychologist at the American Institute for Behavioural Research and Technology, and the author of the study that Martin Moore told me about (and that Google has publicly criticised), showing how search-rank results affect voting patterns. On the other end of the phone, he repeats one of the searches I did. He types “do blacks…” into Google.

“Look at that. I haven’t even hit a button and it’s automatically populated the page with answers to the query: ‘Do blacks commit more crimes?’ And look, I could have been going to ask all sorts of questions. ‘Do blacks excel at sports’, or anything. And it’s only given me two choices and these aren’t simply search-based or the most searched terms right now. Google used to use that but now they use an algorithm that looks at other things. Now, let me look at Bing and Yahoo. I’m on Yahoo and I have 10 suggestions, not one of which is ‘Do black people commit more crime?'

“And people don’t question this. Google isn’t just offering a suggestion. This is a negative suggestion, and we know that negative suggestions, depending on lots of things, can draw between five and 15 more clicks. And this is all programmed. And it could be programmed differently.”


What Epstein’s work has shown is that the contents of a page of search results can influence people’s views and opinions. The type and order of search rankings was shown to influence voters in India in double-blind trials. There were similar results relating to the search suggestions you are offered.

“The general public are completely in the dark about very fundamental issues regarding online search and influence. We are talking about the most powerful mind-control machine ever invented in the history of the human race. And people don’t even notice it.”

https://www.theguardian.com/technology/2016/dec/04/google-democracy-truth-internet-search-facebook
 

El Topo

Member
It's amazing how these companies have fooled the public into believing that they should be left alone and be free from all responsibility or scrutiny.
 

The Technomancer

card-carrying scientician
Yes. Before the rise of social media, before the advent of right-wing memes, before all of it, there was the original problem of the internet: any jackass can make a page that looks just credible enough that a good portion of the population doesn't reflexively recognize it as fake, turning the internet into a misinformation machine. Or, in this case, the pages themselves don't even need to exist; just the results on the search page are enough to make any claim or opinion credible.

I think it's time for us to really recognize that the ideal of the information superhighway, in which all viewpoints and all information are accessible indiscriminately (or in Google's case, discriminated by an algorithm that cares about "popularity" and "usefulness", not "accuracy"), did not just fail; it was in fact conceptually broken.
 

Cipherr

Member
It's amazing how these companies have fooled the public into believing that they should be left alone and be free from all responsibility or scrutiny.

Is that what they have really done? I have seen nothing in my circle but criticism for these companies, from Twitter to Google, for the fact that they don't filter hate speech and shit nearly enough. It just feeds into the post-truth nonsense, whether intentional or not. Top it all off with the fake news wave and you have a serious problem.
 

ant_

not characteristic of ants at all
This is a really difficult situation with a really difficult answer.

I agree with the article in the OP. There are certain elements that should absolutely be removed (the suggestions, for example).

I do feel fearful of allowing Google to have an impartial algorithm for delivering search results.
 
Strange, I don't get suggestions for most "are [group here]" searches. Might be a local thing or something.

But it is a difficult topic. Do you remove those results? Who decides what to include? That has been going on since search engines started, but recent events put it in the spotlight again. And rightfully so.
 

Somnid

Member
Why would you search "Do black people commit more crimes?" Are you genuinely curious? Or are you really looking to confirm your bias?

Once you get a result, what do you do with it? Do you read it for knowledge or perspective? Do you pick the ones you personally like? The most outrageous headline?

Most people understand Google is a consensus machine. Now what if search results are modified to bump or drop results, somehow, for some reason? Did you know governments can compel companies like Facebook and Google to conform to their own morals?
 

Krejlooc

Banned
When I type in "are jews" the suggestions are:

"White"
"Christian"
"a race"

I'm pretty sure these are influenced by your local machine, not a global suggestion.
 

Alx

Member
Most people understand Google is a consensus machine

I'm not even sure that people understand how it works, but many people seem to consider it as an "answer everything" machine. So it can definitely influence the general opinion. And by design, search engines are creating a hivemind, giving answers not based on any specific relevance, but on what's "trending" on a global scale.
There's definitely something worth worrying about, even discarding the controversial results that may or may not appear at the top of the list.
 

Nikodemos

Member
The crap-ass algorithms they use also tend to bring results based on the person's previous searches, so if a fringer nutbag has previously visited fringer nutbaggy sites, the search logic will bring forth more kooky bullshit, creating a mental/intellectual echo chamber.
 
Google search is in the business of giving people what they are looking for. The truth is that this is a sad reflection of people, not a sad reflection of Google, the algorithm, or anything else.

People want to find "Are black people lazy" or "Are feminists crazy" or "Are cops racist" or "Are white people evil."

The real problem with Google (or the Internet) as I see it is that you can find a justification for anything. You can state anything, no matter how factual or counter-factual it is, and then you can find an article, website, or story that justifies it or "proves" it to you. Are cops racist? Well, of course, there's a million articles about it. Are black people lazy? Well of course, there's a million websites about it. Are feminists crazy? Of course, there's a billion results showing they are. Did Jews cause 9/11? Of course they did, there's 100,000,000 results. Did FDR allow Pearl Harbor to happen to justify war? Of course, look at the top result. Was JFK assassinated by the CIA? Sure he was, look at the results. Is this really a problem with the internet or Google? No, it's a problem with people.

The truth is that the narrative -- blacks being lazy, whites being evil, feminists being crazy, Jews doing 9/11, FDR causing WW2, the CIA assassinating JFK -- existed before the Google search, and the Google search is trying to find the best results of what people want to see. Google is in the business of trying to give people what they want, and, depressingly, this is what people want.

But don't think that because you're not racist or anti-semitic that you're not contributing to the problem, it goes into everything: Is the PS4 Pro better than the Xbox Scorpio, is skim milk bad for you, is tipping unethical, are 401(k)'s bad investments, are engagement rings a scam, is telling your kids about Santa bad for them. These are all topics that have appeared on NeoGaf within the last couple days, all of which have people that want to believe a particular thing. Somebody wants to believe that skim milk is bad for you, that the Scorpio is worse than the PS4 Pro, that engagement rings are a scam, that tipping is unethical. You want to believe that and Google is in the business of finding content that you think is relevant. You think that content is relevant because it is justifying a bias or preconceived notion that you have. If Google did perhaps the right thing here, then every result for all of those searches would be "Well... It depends."

And you know what you'd do? You'd use Bing.

*edit*

There is a massive vacuum of truth, though. But this isn't Google's fault. There are no sources of truth anymore. Nobody agrees on who is a source of truth. "Hillary Clinton is a liar." "No, Donald Trump is a liar." "Well Politifact says that Trump lies more than Clinton." "Politifact is biased against Trump, FOX says that Hillary lies the most, well FOX is biased against Clinton..." etc. It goes well beyond politics, but politics are the most egregious. I think this Democratic primary was when I realized we're just fucked when it comes to sources of truth. Sure, Republicans and Democrats can agree on no central source of truth, but in this primary, you saw Democrats claiming that the New York Times was biased and couldn't be trusted, that The Young Turks are biased and can't be trusted, that Kos is biased, that CNN is biased, that NPR is biased. Now, these sources may be biased, but when even people who largely agree on almost everything can't agree on a single source of truth, it creates this cavernous vacuum that can be filled by anything, and that's how you have these counter-factual bullshit websites and news stories pop up. The post-fact websites, fake news sites, and what have you, have filled the void that we created by determining that everything that doesn't meet our preconceived bias is wrong or can't be trusted. When you can't trust NPR, the NYT, AP, and anything else, then it gives an opportunity for peddlers of falsehoods to fill the void, and I've never heard of ProgressNews, RawStory, or YoungCons, or what have you, and nobody is saying they're biased because we've never heard of them, and so maybe they can be trusted more than the NYT... And that's why we're here.
 

Somnid

Member
I'm not even sure that people understand how it works, but many people seem to consider it as an "answer everything" machine. So it can definitely influence the general opinion. And by design, search engines are creating a hivemind, giving answers not based on any specific relevance, but on what's "trending" on a global scale.
There's definitely something worth worrying about, even discarding the controversial results that may or may not appear at the top of the list.

I don't think people confuse search suggestions with answers to "do black people..". Actually type in "Do black people commit more crimes" and read the links.
 
Google search is in the business of giving people what they are looking for. The truth is that this is a sad reflection of people, not a sad reflection of Google, the algorithm, or anything else.

People want to find "Are black people lazy" or "Are feminists crazy" or "Are cops racist" or "Are white people evil."

The real problem with Google (or the Internet) as I see it is that you can find a justification for anything. You can state anything, no matter how factual or counter-factual it is, and then you can find an article, website, or story that justifies it or "proves" it to you. Are cops racist? Well, of course, there's a million articles about it. Are black people lazy? Well of course, there's a million websites about it. Are feminists crazy? Of course, there's a billion results showing they are. Did Jews cause 9/11? Of course they did, there's 100,000,000 results. Did FDR allow Pearl Harbor to happen to justify war? Of course, look at the top result. Was JFK assassinated by the CIA? Sure he was, look at the results. Is this really a problem with the internet or Google? No, it's a problem with people.

The truth is that the narrative -- blacks being lazy, whites being evil, feminists being crazy, Jews doing 9/11, FDR causing WW2, the CIA assassinating JFK -- existed before the Google search, and the Google search is trying to find the best results of what people want to see. Google is in the business of trying to give people what they want, and, depressingly, this is what people want.

But don't think that because you're not racist or anti-semitic that you're not contributing to the problem, it goes into everything: Is the PS4 Pro better than the Xbox Scorpio, is skim milk bad for you, is tipping unethical, are 401(k)'s bad investments, are engagement rings a scam, is telling your kids about Santa bad for them. These are all topics that have appeared on NeoGaf within the last couple days, all of which have people that want to believe a particular thing. Somebody wants to believe that skim milk is bad for you, that the Scorpio is worse than the PS4 Pro, that engagement rings are a scam, that tipping is unethical. You want to believe that and Google is in the business of finding content that you think is relevant. You think that content is relevant because it is justifying a bias or preconceived notion that you have. If Google did perhaps the right thing here, then every result for all of those searches would be "Well... It depends."

And you know what you'd do? You'd use Bing.

Yeah but that doesn't mean they shouldn't be trying to do better. Google's already pretty much perfected the art of showing you the most popular / relevant search results. Their big thing now is consolidating and synthesizing information for you. Giving you the answer to a question rather than just links to sites that probably have the answer, for instance. They want to move from being an intermediary to being an authoritative source for information. But as of now their algorithms are drawing from shitty sources and probably being too gung-ho about trying to provide you information, and eventually that's gonna lead to them serving you paragraphs from white supremacist sites as if they were the definitive answer. Hopefully the past two months have been a wake-up call for Google and they've dedicated a lot more resources to the problem. It's a difficult technical challenge but these systems can be significantly better than they are today.
 

kess

Member
This brings into relief the importance of privacy from Google, and why these companies have fought so hard against the EU implementing privacy rules. It upends their entire business model.
 
Out of curiosity, if I enter "do blacks..." I get:

do blacksmith elixirs stack
do blacksmiths still exist
do blacksmith potions stack
do blacksmiths still exist today
do blacksmith potions stack skyrim

I don't even play Skyrim.
 

G.ZZZ

Member
The problem with Google doing something about this is that it would give them a lot of implicit control. Which may be good today, but will it be tomorrow? It's pretty hard. The only thing I could see is something like a certification for actual real news and information around the internet. Still, it's a shitshow and I don't really want to be the one sorting this out.
 
are atheists right/going to hell/heathens/tax exempt (wtf)

The result of searching for "going to hell" is a report from a print newspaper saying that it is confirmed that atheists are going to hell.

On the 1st page is a hate article from a Christian source saying that there are no atheists in hell. That's because when they go to hell they will have to believe and will no longer be atheists! Funny!
 

royalan

Member
We are supposed to blame google for what people are searching?

No, but we can definitely blame them for how they compile and curate the information that people search for. That is entirely on them.

Pure search is the only way to go, the more I think about it. Just present the relevant results that people search for. Stop trying to read your users' minds. Because your algorithms fucking suck, you fuck up, and it has serious consequences on a macro level.

I've trained myself to completely ignore the search suggestions because half the time they have no relevance to what I'm looking for, and the other half of the time the suggestions are racist/sexist/homophobic/stupid-as-fuck. This isn't new. This problem with Google has been meme'd for years. We've had threads on stupid search suggestions on GAF. Google just saw fit to ignore the problem until this year.
 

Cipherr

Member
No, but we can definitely blame them for how they compile and curate the information that people search for. That is entirely on them.

Pure search is the only way to go, the more I think about it. Just present the relevant results that people search for. Stop trying to read your users' minds. Because your algorithms fucking suck, you fuck up, and it has serious consequences on a macro level.

I've trained myself to completely ignore the search suggestions because half the time they have no relevance to what I'm looking for, and the other half of the time the suggestions are racist/sexist/homophobic/stupid-as-fuck. This isn't new. This problem with Google has been meme'd for years. We've had threads on stupid search suggestions on GAF. Google just saw fit to ignore the problem until this year.

Nah, I'm not down for losing my personalized results. It's far too effective for me, especially professionally. I wouldn't mind if they made you opt in for personalized results (which I believe is the purpose of you creating a Google account, so they can track your searches over years to get an idea of what you are looking for). But I don't want them going away completely because some folks can't understand context.

Make it opt in even more than it already is.
 
Google search is in the business of giving people what they are looking for. The truth is that this is a sad reflection of people, not a sad reflection of Google, the algorithm, or anything else.

I'm sorry, but that's bullshit.

It DIRECTLY reflects Google's ranking algorithm after crawling and indexing has happened. That's exactly what "alt-right" teens and twentysomethings understand that you currently don't. Because the algorithm doesn't take an explicit position on morality, it promotes immoral positions if they can be made to rank as valuable search results. Which is really easy to do, since the relevant links tend to determine the rank, even if Google tries to keep its algorithm from relying only on that.
It's also somewhat unsurprising that people who are involved in Gamergate stuff would know about that, since a lot of them are probably involved in the streaming game too, where knowing SEO becomes relevant to income.

It's a human-made structure, without the key component that makes us non-psychopaths on average.
You're making the same argument as someone who would claim there's such a thing as objective science. There isn't, there never was, there never will be. You need to skew for relevant, peer-reviewed, trustworthy results. Otherwise you get junk. Which is what these results are.


It's also not an accurate reflection of real human search behavior. A person asking a moral question isn't looking for a result dictated by 999 fake results versus 1 real one.
I mean, how often do you believe a real human would type in "are Jews evil" as an actual, genuinely motivated human search, on the worldwide average, compared to internal search behavior from one relevant topic (say, the main Wikipedia page on Judaism) to another (say, the Wikipedia page on the Torah and other scripture)? If you know what a factor analysis is in statistics, then you know exactly what I'm referring to.

SEO and search engine results DO NOT, in any way or form, accurately reflect human behavior. Don't weigh all humans by the idiocy of the technology created by a few. You know, like Mark Zuckerberg being an idiot being used as a metric for all people.
 

Ether_Snake

安安安安安安安安安安安安安安安
It's pretty easy for Google to fix, especially with neural networks. They aren't fixing it for whatever reason.
 

FoxSpirit

Junior Member
And it's localized to boot. I can't get US Google results because I'm in Austria, and typing in "jews " or "Juden " results in 0 suggestions. They know they could probably get into deep shit with the German government over such search results.

Back in the day, Google would give me interesting hits on obscure websites. Nowadays it seems like a shitfest of linkspam sites that do nothing but regurgitate press releases. Interesting results on page 12.
 
Google as a search engine doesn't have a whole lot of responsibility IMO.

Google as a question-answering machine definitely does though. Especially when it highlights an answer at the top of the page -- it definitely has a responsibility to get that right.

They also better be careful with their assistant. That has a responsibility to be right as well.
 
People actually type questions into search engines? Might as well just jerk off, 'cause the results are going to be loaded.
 

Biff

Member
Even with these ridiculous suggestions, Google Search is still a massive, MASSIVE net-positive for humanity and the advancement of knowledge. So yeah, some of those examples are bad, but I'm not going all pitchfork on Google anytime soon.

Now, while I don't expect Google to screen through millions of potential questions, I think it's a pretty low bar to expect them to hire a few interns to weed out some of the more problematic auto-fill responses and ensuing front-page suggested answers.

I don't expect the neural network to be a fully-functioning ethical being anytime soon. In the meantime I think it's pretty reasonable to expect a small level of human interference to block the more egregious examples discussed in the article.
 

kswiston

Member
The first suggestion for "are black people" is "are black people real" on google.ca where I am.....

The only suggestion for "are white people" is "are white people evil"
 
The first suggestion for "are black people" is "are black people real" on google.ca where I am.....

The only suggestion for "are white people" is "are white people evil"

Black people are a conspiracy made up by the Jews to take Zwarte Piet from us. Don't be fooled.
 
Why would you search "Do black people commit more crimes?" Are you genuinely curious? Or are you really looking to confirm your bias?

I often type questions like that to refute what I see friends post on Facebook. "89% of white homicide victims are killed by blacks" graphic, etc. Wait, what? That can't be true. Type horrible lie into Google, contribute to the problem.
 
I understand the problem of "there are untruths on the internet and innocents are being bamboozled", but is the answer to that anything more than to teach individuals some healthy skepticism and how to determine whether a source is credible or not?

The more popular answer lately is asking internet companies and/or the government to regulate the internet on a massive scale in order to create the desired context. How do we do this in an effective manner that does not lead to gross abuse of such power? Or is the problem so dire that the risks are worth it as long as this problem is solved?

I'm honestly not that opposed to it myself. Imagine an Internet that cracks down on harassment, fear mongering, and hateful views. Would be nice, as long as we get it right the first time and it never falls into the hands of people with irrational and unethical beliefs.
 

IrishNinja

Member
jesus christ if your google lists anything pointing at bodybuilding forums you should just ron swanson your PC with a quickness
 

Totakeke

Member
jesus christ if your google lists anything pointing at bodybuilding forums you should just ron swanson your PC with a quickness

[screenshots of search results]


/shrug


I believe Google did take down the "did black people commit more crimes" suggestion though. It was working this morning. Maybe they read the article and decided to react?
 

Jebusman

Banned
[screenshots of search results]


/shrug


I believe Google did take down the "did black people commit more crimes" suggestion though. It was working this morning. Maybe they read the article and decided to react?

I dunno man, "Diggerfortruth.wordpress.com" sounds like a legit source. Don't want to mess with that.
 

heidern

Junior Member
Option 1) Amoral search algorithms select search results.
Option 2) The people's elected representative (Donald Trump) gives moral guidelines to be integrated into search algorithms.
 
I wonder if anyone realizes how PageRank actually works.

Those search results are at the top because so many people go to those sites and so many other pages link to them. The problem isn't the search engine, it's humanity being terrible. And there isn't any solution for that.

Unless you want Google to engage in massive unprecedented scales of censorship, the results are what they are. I mean, that's how Baidu works in China, the government tells Baidu what search results are appropriate there.
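
For reference, the link part is what PageRank formalizes: a page's score comes from the scores of the pages linking to it, not from raw traffic. A toy power-iteration sketch in Python (hand-made link graph, hugely simplified; the real ranking pipeline weighs far more signals than this):

# Toy PageRank: rank flows along links in a tiny hand-made graph.
# Purely illustrative; Google's actual ranking uses many more signals.
links = {
    "a": ["b", "c"],   # page "a" links to "b" and "c"
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],        # "d" links to "c"; nothing links back to "d"
}
damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += damping * share
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))
# "c" ends up on top: it has the most incoming links, from well-ranked pages.

The takeaway is just that rank flows along links, so pages rise by being linked from other well-ranked pages, which is exactly the kind of thing a coordinated network of sites linking to each other can exploit.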
 
Google cannot deny that they have a responsibility to fix this issue when they are suggesting results to users.

If I start typing an outright derogatory term, Google knows not to suggest anything, but it is not willing to fix the issue of suggesting hateful searches.
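
To be clear about how low the bar is, even a crude denylist over suggestion candidates would catch the examples from the article. A purely hypothetical sketch in Python (the function and the patterns are made up for illustration, not anything Google actually runs):

# Hypothetical autocomplete filter: drop candidates that pair a group with a
# loaded predicate. Illustrative only; not Google's actual system.
BLOCKED_PATTERNS = [
    ("jews", "evil"),
    ("muslims", "bad"),
    ("black people", "crime"),
]

def filter_suggestions(candidates):
    safe = []
    for suggestion in candidates:
        text = suggestion.lower()
        if any(group in text and word in text for group, word in BLOCKED_PATTERNS):
            continue  # suppress the hateful completion instead of surfacing it
        safe.append(suggestion)
    return safe

print(filter_suggestions(["are jews evil", "are jews a race", "are jews christian"]))
# -> ['are jews a race', 'are jews christian']

A real system would need far more care (other languages, creative spellings, legitimate news queries), but Google already suppresses completions for some terms, so extending that is not technically out of reach.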

Google's 6th philosophy is: You can make money without doing evil.

I would argue that the algorithm is doing evil.

“This is awful. It’s horrible. It’s the equivalent of going into a library and asking a librarian about Judaism and being handed 10 books of hate. Google is doing a horrible, horrible job of delivering answers here. It can and should do better.”

Amazing quote from the OP
 

Ether_Snake

安安安安安安安安安安安安安安安
Because it would fuck up their entire targeted advertising scheme.

No it wouldn't, it would improve it.

I wonder if anyone realizes how PageRank actually works.

Those search results are at the top because so many people go to those sites and so many other pages link to them. The problem isn't the search engine, it's humanity being terrible. And there isn't any solution for that.

Unless you want Google to engage in massive unprecedented scales of censorship, the results are what they are. I mean, that's how Baidu works in China, the government tells Baidu what search results are appropriate there.

That's not how it works; otherwise the very first site it ever suggested would stay at the top, because it would just reinforce itself as the top visited site. The search results are more complex than that, but not pertinent enough per user, and that's the problem. With neural networks, Google will eventually have a better idea of why you are searching for something even if it knows nothing about you personally, instead of "oh, here's some site that has those words and is visited a lot, hope it helps!".
 