Bing automatically scans (or “crawls”) the internet to develop and maintain an index to generate and display a set of search results (the “displayed search results”). The index is really a catalog of available online resources, including websites, images, videos, documents, and other items. Particular displayed search results are created by using a computer algorithm to match the search terms you enter with results in our index. In general, we try to provide as comprehensive and as useful a collection of displayed search results as we can. We design algorithms to provide the most relevant and useful results and determine which displayed search results appear for any given search.
Bing doesn’t control the operation or design of the websites we index. We also don’t control what these websites publish. As long as the website continues to make the information available on the web, the information will be generally available to others through Bing or other search services.
In limited cases, where relevant laws and/or public policy concerns address issues such as privacy, intellectual property protection, and the protection of children, we might remove a displayed search result of a particular resource. In each case, we try to limit our removal of displayed search results to a narrow set of circumstances so that we don’t overly restrict access of Bing users to relevant information.
Here are the circumstances in which Bing does this, and how.
How we help stop the distribution of child abuse content
Sadly, the abuse of children is not new, but the internet affords a number of new opportunities to those who would commit crimes against children, including trafficking in images of sexual abuse. Bing works with law enforcement and other authorities to help stop the flow of this content online. One of the ways that we do this is by removing displayed search results that have been reviewed by credible agencies and found to contain or relate to the abuse of children.
In particular, we remove from our displayed search results links that have been identified by the Internet Watch Foundation (UK), NCMEC (USA), or FSM (Germany) as, in their good faith judgment, hosting or providing access to child abuse content that is illegal under the laws of their jurisdiction. Removing these links from displayed search results doesn’t block them from being accessed on the internet or discovered through means other than Bing, but it does reduce the ability of those who would seek out child abuse content to find it and reduces the extent to which sellers of such content can profit from it.
We remove these types of displayed search results only when we’re confident that the government or quasi-governmental agency providing the links:
- Is a credible and accountable organization.
- Limits the scope of its work to illegal child abuse content.
- Provides some measure of recourse (like the ability to appeal) if content or sites hosting such content are blocked incorrectly.
How we protect intellectual property
In many countries, including the United States, search providers are obligated to respond to claims from rights holders about unauthorized posting, distribution, or other publication of protected content. The international community recognizes that such unauthorized publication can infringe on the rights of content owners and has fashioned both international treaties and local laws to address the matter. Pursuant to these laws, and in support of our own policies encouraging respect for intellectual property, we might remove certain displayed search results from our index upon notice from rights holders.
Bing recognizes that the rights of content owners exist alongside the rights of users and that creativity and innovation online should be encouraged. To this end, Bing has helped develop a set of principles with respect to user-generated content applications (some of which generate links that we catalog in our search service). Learn more about those principles at http://www.ugcprinciples.com. We also review counter-notices that comply with the Digital Millennium Copyright Act sent from parties who wish to object to the removal of their content.
How we address allegations of libel or defamation
Similarly, countries around the world have adopted laws and procedures to address defamation, libel, slander, and other harms related to false statements that are made or implied to be fact and which might yield a negative perception about an individual, business, or other organization. We may remove displayed search results containing allegedly defamatory content. For example, we might remove a displayed search result if we receive a valid and narrow court order indicating that a particular link has been found to be defamatory.
How we work to prevent the invasion of privacy
From time to time, webpages that are publicly available will intentionally or inadvertently contain private information that is posted without the consent of the individual identified or in circumstances that create security or privacy risks. Examples include inadvertent posting of public records, private phone numbers, identification numbers and the like, or intentional posting of email passwords, login credentials, credit card numbers, or other data that is intended to be used for fraud or hacking.
Bing doesn’t control the sites that publish this information or what they publish. Most of the time the website is in the best position to address any privacy concerns about the information it publishes. As long as the website continues to make the information available on the web, the information will be available to others. Once the website has removed the information and we have crawled the site again, it will no longer appear in our results.
If the information has already been removed from that website but is still showing up in Bing displayed search results, you can use the Content Removal Tool to submit a page removal or outdated cache removal request. To learn more about the Content Removal Tool, go to Bing Webmaster Help & How-To.
How we address web spam
Some pages captured in our index turn out to offer little or no value to users. They may also have characteristics that artificially manipulate the way search and advertising systems work, inflating their apparent relevance relative to pages that offer more useful information. Some of these pages contain only advertisements and/or links to other websites that contain mostly ads, with no content, or only superficial content, relevant to the subject of the search. To improve the search experience for consumers and deliver more relevant content, we might remove such displayed search results, or adjust our algorithms to prioritize more useful and relevant pages in displayed search result sets.
How we address laws specific to individual countries
Some countries maintain laws or regulations applying to search service providers that require us to remove access to certain information that Bing has indexed, primarily for geopolitical reasons or because of local cultural norms and sensibilities. We must balance our support for freedom of access to information by people of all countries with the compliance required to offer search services in a specific jurisdiction. When a governmental entity asks us to remove displayed search results, we require proof of the applicable law, proof of the agency’s authority, and an official request requiring removal. If such proof is provided and we can verify it, then we may comply with the removal request. If we are required to implement the request, we will do so narrowly. If the removal request is inconsistent with international standards, we might seek clarification in order to further narrow our obligation to comply.
How we address issues related to access to adult content
Bing offers SafeSearch settings, which allow most users to set the type of filtering of adult content that they would like applied to their search results. By default, in most markets all searches are set to moderate, which restricts visually explicit search results but does not restrict explicit text. Because of local customs or cultural norms, certain countries may impose legal restrictions on the display of adult content. As a result, what constitutes adult content might vary depending on the market.
Bing categorizes certain countries as strict markets. In these strict markets, we might restrict the display of adult content (as locally defined), and because of the local customs, norms, and laws, we might limit SafeSearch settings to “strict” only. When set to “strict,” SafeSearch filters the display of explicit search results in images, videos, and text. Markets that are limited to “strict” include:
- Middle East