EU Digital Services Act (EU DSA) Biannual VLOSE/VLOP Transparency Report

Google is committed to promoting transparency for the users of our platforms.

Published: 27 October 2023

The European Union (EU) Digital Services Act (DSA) came into force on 16 November 2022. Google has long been aligned with the broad goals of the
DSA and has devoted significant resources to tailoring our programs to meet its specific requirements. We welcome the DSA's goals of making the internet even safer, more transparent and more accountable, while ensuring that everyone in the EU continues to benefit from the open web.

In accordance with Articles 15, 24, and 42 of the DSA, Google is publishing biannual transparency reports for its services designated by the European Commission as a Very Large Online Search Engine (VLOSE) or a Very Large Online Platform (VLOP): Google Search, Google Maps, Google Play, Shopping and YouTube. This report describes Google's efforts and resources to moderate content on the services listed above in the EU during the period from 28 August 2023 to 10 September 2023 [1].

Overview

Since Google was founded, our mission has been to organise the world's information and make it universally accessible and useful. When it comes to the informa
tion and content on our platforms, we take seriously our responsibility to safeguard the people and businesses using our products, and do so with clear and transparent policies and processes.

As such, our product, policy, and enforcement decisions are guided by a set of principles which enable us to preserve freedom of expression, while curbing the spread of content that is damaging to users and society.

1. We value openness and accessibility: We lean towards keeping content accessible by providing access to an open and diverse information ecosystem.
2. We respect user choice: If users search for content that is not illegal or prohibited by our policies, they should be able to find it.
3. We build for everyone: Our services are used around the world by users from different cultures, languages, and backgrounds, and at different stages in their lives. We take the diversity of our users into account in policy development and policy enforcement decisions.

These principles are addressed in three key ways, providing our users with access to trustworthy information and content.

First, we protect users from harm through built-in advanced protections, policies, and a combination of scaled technology and specially trained human revi
ewers. These mechanisms enable us to prevent distribution of harmful and illegal content before it reaches users; detect and evaluate potentially violative content; and respond to bad actors and abusive content in an appropriate way. Second, through our ranking and recommendation systems, we deliver reliable information to users, as well as provide tools to help users evaluate content themselves, giving them added context and confidence in what they find on our products and services, and across the internet. Third, we partner to create a safer internet and scale our impact, collaborating with experts, governments, and organisations to inform our tools and share our technologies.

Helpful, safe online environments do not just happen; they are designed. At Google, we aim to balance access to information with protecting users and society, while providing information and content users can trust.

In this report, we outline and provide metrics contemplated by the DSA regarding our efforts and resources to moderate potentially illegal content and policy-violative content in the EU. We are committed to improving and augmenting future iterations with further insights about our continued efforts to combat violative content on our platforms.

[1] Due to the short time between the DSA applicability date and the report deadline, and the time required to conduct data validation, it was only feasible to include a 2-week reporting period for this first report. Future reports will provide metrics collected over a longer reporting period.
Section 1: Article 9 and 10 Orders from Member States' authorities

Article 15(1), point (a)

Courts and government agencies in the EU regularly request that we remove information from Google services (Removal Orders). These requests are routed to the appropriate team(s) within Google, who review these requests closely to determine if information should be removed because it may violate a law or our product policies. In addition, specific Member State laws allow government agencies in the EU to request user information for civil, administrative, criminal, and national security purposes (User Data Disclosure Orders). Each request is carefully reviewed to make sure it satisfies applicable laws. No Removal Orders or User Data Disclosure Orders conforming to the requirements of Articles 9 and 10 of the DSA were received during the reporting period.

Information about other requests from government authorities around the world is published in our Government Requests for Content Removal Transparency Report and our Government Requests for User Information Transparency Report [2].

[2] Information in these reports is voluntarily provided and not necessarily directly comparable with information presented in this mandat
ed DSA report, due to differences in methodologies.

Section 2: Notices received through notice and action mechanisms

Article 15(1), point (b)

Google's content and product policies apply wherever you are in the world, but we also have processes in place to remove or restrict access to content based on local laws. Users, Trusted Flaggers (as defined by Article 22) and other entities can report content that they would like to be removed from Google's services under applicable laws. Action is taken on content that is deemed to violate applicable laws or Google policies.

2.1 Number of notices submitted in accordance with Article 16, broken down by type of alleged illegal content concerned

Article 15(1), point (b)

Table 2.1.1 reflects the number of notices submitted by EU-based users and other entities in accordance with Article 16 during the reporting period, broken down by type of alleged illegal content and s
ervice.

Table 2.1.1: Number of Article 16 notices submitted, by type of alleged illegal content and service

Type of alleged illegal content         Maps    Play  Shopping  YouTube  Multi-Services[1]
Child Sexual Abuse and Exploitation        4       0         0      235                  1
Circumvention                              0       0         0        6                  0
Copyright                                 36      79        17   36,568                112
Counterfeit                                0       1         6      429                  7
Defamation                            14,627       6         4    1,768                  2
Hate and Harassment                        2       0         0      767                  0
Privacy                                  235       3         1      636                  0
Trademark                                  0      56         1      653                171
Violent Extremism                          0       0         0      180                  0
Other Legal                                2       1         0      848                 32
Total                                 14,906     146        29   42,090                325

Notes:
[1] Notices relating to advertisements that may have appeared across multiple Google services, including VLOPs, are included under Multi-Services.

2.2 Number of Article 16 notices submitted by Trusted Flaggers

Article 15(1), point (b)

In the European Union, national entities called Digital Services Coordinators may award Trusted Flagger status to entitie
s tasked with flagging allegedly illegal content on online platforms. Trusted Flaggers are likely to have expertise in one or more fields relevant to content moderation, such as privacy or child safety. The European Commission will maintain a list of designated Trusted Flaggers in a publicly accessible database. No Trusted Flagger status has been awarded at this time, and therefore no Article 16 notices submitted by Trusted Flaggers were received during the reporting period.

2.3 Number of actions taken in response to Article 16 notices, broken down by actions based on legal grounds and actions based on policy gr
ounds

Article 15(1), point (b)

Legal standards vary greatly by country/region. Content that violates a specific law in one country/region may be legal in others. Typically, Google removes or restricts access to content only in the country/region where it is deemed to be illegal. However, when content is found to violate Google's content or product policies or Terms of Service, Google may remove or restrict access globally.

When a legal notice is reviewed and the content violates our content policies, action may be taken on policy grounds. If the content does not violate our policies, Google may take action on leg
al grounds, in line with local laws (see Table 2.3.1 for breakdown by service). As a legal notice may contain one or more URLs for review, multiple actions may be taken as a result of a single notice received.

Table 2.3.1: Number of actions taken in response to Article 16 notices, by service and basis of the action[1]

Service            Actions taken on legal grounds  Actions taken on policy grounds
Maps                                       25,077                            1,191
Play                                           36                               57
Shopping                                      141                               61
YouTube                                    32,522                                5
Multi-Services[2]                             107                               96

Notes:
[1] More than one action can be taken on an Article 16 notice.
[2] Notices relating to advertisements that may appear across multi
ple Google services, including VLOPs, are included under Multi-Services.

2.4 Number of Article 16 notices processed by automated means

Article 15(1), point (b)

During the reporting period, YouTube processed 20,157 Article 16 notices by automated means (i.e., with no human involvement). Article 16 notices are not processed by automated means for any of the other VLOPs.

2.5 Median time needed to take action on content identified in Article 16 notices

Article 15(1), point (b)

Table 2.5.1 reflects the median time, in days, needed to take action on content identified in Article 16 notices for each service.

Table 2.5.1: Median time to take action on Article 16 notices, by service

Service          Median time to take action (days)
Maps                                             6
Play                                             1
Shopping                                         3
YouTube                                          1
Multi-Services                                   1

99.99% of all fully automated removal decisions on Web Search that impacted users based in the EU were unchanged, while 0.01% were reinstated as a result
of a counter notice.

Automated Tools used to combat Child Sexual Abuse Material (CSAM)

Google takes its responsibility to fight child sexual abuse and exploitation online very seriously. We do this by combatting CSAM across Google's products and by detecting instances of abuse and enforcing robust policies. We also partner with non-governmental organisations (NGOs) and others in industry to share proprietary technology and drive the industry forward.

Built-in protections help prevent Google products from showing abusive content and deter bad actors. For example, Google deploys safety-by-design principles to deter users from seeking out CSAM on Google Search. It is our policy to block search results that lead to child sexual abuse imagery or material that appears to sexually victimise, endanger or otherwise exploit children. We are constantly updating our algorithms to combat these evolving threats. We apply extra protections to searches that we recognise as seeking CSAM content. We filter out explicit sexual results if the search query seems to be seeking CSAM. For queries seeking adult explicit content, Google Search won't return imagery that includes children, to break the association between children and sexual content. In many countries, users who enter queries clearly related to CSAM are shown a prominent warning that child sexual abuse imagery is illegal, with information on how to report this content to trusted organisations. When these warnings are shown, we have found that users are less likely to continue
looking for this material.

In order to detect and report CSAM, we may use a combination of cutting-edge technology, including machine learning classifiers (to identify unknown CSAM) and hash-matching technology, as well as trained specialist teams. Hash-matching technology creates a "hash", or unique digital fingerprint, for an image or a video so it can be compared with hashes of known CSAM. When Google finds CSAM, our services remove it, report it to the National Center for Missing and Exploited Children (NCMEC), and take action, which may include disabling the account.

Google scales its impact by collaborating with NCMEC and partnering with NGOs and industry coalitions to help grow and contribute to a joint understanding of the evolving nature of child sexual abuse and exploitation. One of the ways Google contributes is by creating and sharing free tools to help other organisations prioritise potential CSAM images for human review. For example, Google's Child Safety Toolkit consists of two APIs. The first is CSAI Match, an API developed by YouTube that partners can use to automatically detect known videos of CSAM so they can flag it for review, confirm it, report it, and act on it. The second is Google's Content Safety API, which helps partners classify and prioritise novel potentially abusive images and videos for review. Detection of never-before-seen CSAM helps the child safety ecosystem by identifying child victims in need of safeguarding and contributing to the list of known digital fingerprints to grow our abilities to detect known CSAM.

Google takes action not just on illegal CSAM, but also on wider content that promotes the sexual abuse and exploitation of children and can put children at risk.

Automated tools that affect advertisements

Advertisements can appear across multiple VLOP and VLOSE services. To keep ads safe
and appropriate for everyone, ads are reviewed to make sure they comply with Google Ads policies. Google uses a combination of automated and human evaluation to detect and remove ads which violate our policies and are harmful to users and the overall ecosystem. Our enforcement technologies may use automated evaluation, modelled on human reviewers' decisions, to help protect our users and keep our ad platforms safe. Policy-violating content is either removed by automated means or, where a more nuanced determination is required, flagged for further review by trained operators and analysts who conduct content evaluations that might be difficult for algorithms to perform alone, for example because an understanding of the context of the ad is required. The results of these manual reviews are then used to help build training data to further improve our machine learning models.

When reviewing ad content or advertiser accounts to determine whether they violate our policies, Google takes various information into consideration when making a decision, including the content of the creative (e.g. ad text, keywords, and any images and video) as well as the associated ad destination. Google also considers account information (e.g., past history of policy violations) and other information provided through reporting mechanisms (where applicable) in our investigation.

During the reporting period, 2% of Google's fully automated enforcement decisions on ads placed by advertisers in the EU were overturned after subsequently underg
oing human review.

3.2.2 Google Search

Google Search relies on a combination of people and technology to enforce Google Search policies. Machine learning, for example, plays a critical role in content quality on Google Search. Google Search systems are built to identify and balance signals of authoritative