CogDogBlog: Google Image Search Tilts Towards The Photo Scraper Sites

CogDogBlog: Google Image Search Tilts Towards The Photo Scraper Sites. “In theory, their motto is about not doing evil, but Google sure does some smelly things that we can only guess at because of their opaqueness. Like giving favorable image search results to sites that obviously scrape content from other sites and serve it up as shoddy copies.”

The Register: Transparent algorithms? Here’s why that’s a bad idea, Google tells MPs

The Register: Transparent algorithms? Here’s why that’s a bad idea, Google tells MPs. “Opening up the processes that underpin algorithms may well magnify the risk of hacking, widen privacy concerns and stifle innovation, Google has told MPs. The comments came in Google’s response to the House of Commons Science and Technology Committee’s inquiry into algorithmic decision-making, which is questioning whether organisations should be more open about how machines influence such choices.”

EurekAlert: Researchers unveil tool to debug ‘black box’ deep learning algorithms

EurekAlert: Researchers unveil tool to debug ‘black box’ deep learning algorithms. “Deep learning systems do not explain how they make their decisions, and that makes them hard to trust. In a new approach to the problem, researchers at Columbia and Lehigh universities have come up with a way to automatically error-check the thousands to millions of neurons in a deep learning neural network. Their tool, DeepXplore, feeds confusing, real-world inputs into the network to expose rare instances of flawed reasoning by clusters of neurons. Researchers present it on Oct. 29 at ACM’s Symposium on Operating Systems Principles in Shanghai.”

Niemanlab: Are you a low-quality web page? (Are you sure?) Facebook sheds a little light on its algorithm

Niemanlab: Are you a low-quality web page? (Are you sure?) Facebook sheds a little light on its algorithm. “Facebook … went ahead on Tuesday and, at an event at CUNY in New York, released a set of ‘News Feed Publisher Guidelines’ that aim to decipher the News Feed algorithm a bit and help publishers better understand ‘content guidelines, quality guidelines, and community standards to aid your efforts to find and engage your audience on News Feed.’ (In other words, do this or we’ll bump your stuff to the secondary feed. Haha! Just kidding.) The panel included a discussion between Facebook News Feed VP Adam Mosseri and CUNY professor Jeff Jarvis.”

Quartz: Silicon Valley has designed algorithms to reflect your biases, not disrupt them

Quartz: Silicon Valley has designed algorithms to reflect your biases, not disrupt them. “Silicon Valley dominates the internet—and that prevents us from learning more deeply about other people, cultures, and places. To support richer understandings of one another across our differences, we need to redesign social media networks and search systems to better represent diverse cultural and political perspectives.”

Medium: Algorithmic Consumer Protection

Medium: Algorithmic Consumer Protection. “This March, Facebook announced a remarkable initiative that detects people who are most at risk of suicide and directs support to them from friends and professionals. As society entrusts our safety and well-being to AI systems like this one, how can we ensure that the outcomes are beneficial?”