Blog posts on Google tools.

  • A recipe for beating the record of most-calculated digits of pi

    Editor’s note: Today, March 14, is Pi Day (3.14). Here at Google, we’re celebrating the day with a new milestone: a team at Google has broken the Guinness World Records™ title for most accurate value of pi.

    Whether or not you realize it, pi is everywhere you look. It’s the ratio of the circumference of a circle to its diameter, so the next time you check your watch or see the turning wheels of a vehicle go by, you’re looking at pi. And since pi is an irrational number, there’s no end to how many of its digits can be calculated. You might know it as 3.14, but math and science pros are constantly working to calculate more and more digits of pi, so they can test supercomputers (and have a bit of healthy competition, too).

    While I’ve been busy thinking about which flavor of pie I’m going to enjoy later today, Googler Emma Haruka Iwao has been busy using Google Compute Engine, powered by Google Cloud, to calculate the most accurate value of pi—ever. That’s 31,415,926,535,897 digits, to be exact. Emma used the power of the cloud for the task, making this the first time the cloud has been used for a pi calculation of this magnitude.

    Here’s Emma’s recipe for what started out as a pie-in-the-sky idea to break a Guinness World Records title:

    Step 1: Find inspiration for your calculation.

    When Emma was 12 years old, she became fascinated with pi. “Pi seems simple—it starts with 3.14. When I was a kid, I downloaded a program to calculate pi on my computer,” she says. “At the time, the world record holders were Yasumasa Kanada and Daisuke Takahashi, who are Japanese, so it was really relatable for me growing up in Japan.” Later on, when Emma was in college, one of her professors was Dr. Daisuke Takahashi, then the record holder for calculating the most accurate value of pi using a supercomputer.
    “When I told him I was going to start this project, he shared his advice and some technical strategies with me.”

    Step 2: Combine your ingredients.

    To calculate pi, Emma used an application called y-cruncher on 25 Google Cloud virtual machines. “The biggest challenge with pi is that it requires a lot of storage and memory to calculate,” Emma says. Her calculation required 170 terabytes of data to complete—that’s roughly equivalent to the amount of data in the entire Library of Congress print collections.

    Step 3: Bake for four months.

    Emma’s calculation took the virtual machines about 121 days to complete. During that whole time, the Google Cloud infrastructure kept the servers going. If there’d been any failures or interruptions, it would’ve disrupted the calculation. When Emma checked to see if her end result was correct, she felt relieved when the number checked out. “I started to realize it was an exciting accomplishment for my team,” she says.

    Step 4: Share a slice of your achievement.

    Emma thinks there are a lot of mathematical problems out there to solve, and we’re just at the beginning of exploring how cloud computing can play a role. “When I was a kid, I didn’t have access to supercomputers. But even if you don’t work for Google, you can apply for various scholarships and programs to access computing resources,” she says. “I was very fortunate that there were Japanese world record holders that I could relate to. I’m really happy to be one of the few women in computer science holding the record, and I hope I can show more people who want to work in the industry what’s possible.”

    At Google, Emma is a Cloud Developer Advocate, focused on high-performance computing and programming language communities. Her job is to work directly with developers, helping them do more with the cloud and sharing information about how products work.
    And now, she’s also sharing her calculations: Google Cloud has published the computed digits entirely as disk snapshots, so they’re available to anyone who wants to access them. This means anyone can copy the snapshots, work on the results and use the computation resources in less than an hour. Without the cloud, the only way someone could access such a large dataset would be to ship physical hard drives.

    Today, though, Emma and her team are taking a moment to celebrate the new world record. And maybe a piece of pie, too. Emma’s favorite flavor? “I like apple pie—not too sweet.”

    For the technical details on how Emma used Google Compute Engine to calculate pi, head over to the Google Cloud Platform blog. […]
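    Emma’s recipe can be tasted at desktop scale. The sketch below is an illustration, not her actual setup: it sums the Chudnovsky series—the rapidly converging formula that y-cruncher and other modern pi records are built on, with each term contributing about 14 digits—using Python’s decimal module. The function name `chudnovsky_pi` is ours; record-scale runs layer binary splitting and distributed storage on top of this basic idea.

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Compute pi to roughly `digits` decimal places via the Chudnovsky series."""
    getcontext().prec = digits + 10  # extra guard digits to absorb rounding
    C = 426880 * Decimal(10005).sqrt()
    # Running terms: M is an exact integer recurrence, X the power term,
    # L the linear term, S the accumulating sum.
    K, M, X, L = 6, 1, 1, 13591409
    S = Decimal(L)
    for i in range(1, digits // 14 + 2):  # ~14 digits per term
        M = M * (K ** 3 - 16 * K) // i ** 3  # exact: result is always an integer
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    return C / S

print(str(chudnovsky_pi(50))[:52])  # 3.14159265358979...
```

    Five terms already give 50 correct digits; the challenge at 31.4 trillion digits is not the formula but the memory, storage and months of uninterrupted compute the post describes.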

  • Bringing new voices, and communities, to the world of podcasts

    Editor’s note: The Google Podcasts creator program, run by PRX, provides 20 weeks of training, mentorship, and seed funding to promising podcasters, with the aim of promoting underrepresented voices throughout the industry and around the world. Applications for the next round are currently open and will be accepted until 11:59pm ET, Sunday, April 14. Catalina May and Martín Cruz, the team behind “Las Raras” (“The Outsiders”), are independent podcasters based in Santiago, Chile. They are one of the six teams participating in the first round of the program. Their training began in January 2019 with a week-long intensive “bootcamp” at the PRX Podcast Garage in Boston and will culminate in a final showcase on June 19 in Boston.

    A few days ago, while reporting about Chilean countrymen fighting for their land, we contacted a source to request an interview. Our source asked us where the story would appear. “We have an independent podcast,” we explained. “It’s called ‘Las Raras,’ and we are part of the Google Podcasts creator program.” Then, silence. For many seconds. We knew what we had to ask: “Do you know what a podcast is?” She did not.

    This is a very common situation for us. Not only in Chile where we live, but also in Latin America broadly, many people haven’t even heard the word podcast, much less listened to one.

    Cruz and May at the Google Podcasts creator program.

    When we arrived in Boston for the Google Podcasts creator program bootcamp, we experienced the total opposite of our situation in Santiago. At the PRX Podcast Garage, we met amazing trainers and the five other teams in the program.
    It was a dream to talk about podcasting for twelve hours a day with a diverse group of people who share our passion. Through the other teams, we learned how many different goals a podcast might have: raise awareness about LGBTQ+ people’s lives in a place where homosexuality is illegal (“AfroQueer”), tell stories from the Filipino diaspora (“Long Distance”), reflect on modern beauty standards (“The Colored Girl Beautiful”), introduce children to Puerto Rican history (“Timestorm”), or talk about car culture and road rage (“Who Taught You How to Drive?!”).

    We created our podcast, “Las Raras,” in 2015, inspired (like many) by the first season of “Serial.” As a journalist and a sound engineer, we heard right away what podcasts could do. This intimate medium is perfect for telling stories of people who are frequently overlooked, stories of people challenging norms and stories of people defying the status quo. In other words, podcasts let us tell stories that we feel passionate about and that are not often heard in Chile. Right away, we loved the innovation and openness podcasting offered. But it was daunting, too. We had to learn a new way of interviewing and structuring stories, and how to use sound without visuals. There were almost no other podcasts in Chile at that time, either.

    It was also financially risky. Our first two seasons were self-financed. Luckily, for our third season, we got some support from the International Women’s Media Foundation. It helped, but we were still feeling quite alone and facing an uncertain road ahead. When we first heard about the Google Podcasts creator program, it seemed perfect for us, because its goal is to increase the diversity of voices in the podcasting industry.

    We put our hearts and souls into the application, making clear that we were at a critical time for our podcast. Two days before Christmas, we received an email confirming that we had been chosen. It took us a couple of weeks to really believe that it was happening.
    It’s an honor, but also a huge responsibility. We know the stories we want to tell, and the support of the Google Podcasts creator program will allow us to take “Las Raras” to the next level. Our goal in the program is to find a way to be successful on a long-term basis. With the support of Google, PRX, our mentors and our fellow podcasters, we are focusing our attention on better understanding the needs of our audience, and developing a sustainable business model for “Las Raras.” The program offers the training to improve the quality of the stories we love to tell, and those we know our audience wants to hear. […]

  • Driving change with Rolling Study Halls

    Editor’s note: We’re sharing the stories of bus drivers from Talladega County, Alabama who participate in Rolling Study Halls. This program powers their buses—along with others across the U.S.—with Wi-Fi, devices and onboard educators to help thousands of students reclaim 1.5 million learning hours. As part of our Grow with Google initiative to help provide more Americans with access to the tools and skills they need, we expanded the program across the country in 2018. In the past academic year, participating school districts across the U.S. have seen improvements in student grades, confidence and homework completion. Dr. Suzanne Lacey, Superintendent from Talladega County, is sharing more about the impact this program is having in her community and beyond.

    Talladega County is home to more than 7,000 students across 17 schools. The majority of our students spend a sizable part of their day on a bus getting to and from school. In our rural county, many students also face limited access to the internet—and it’s not just an economic issue. In a lot of places where our students live, there simply isn’t access available. For these reasons, Google’s Rolling Study Halls has become an important part of our educational program since we became a participating district last April. Through a creative use of commute time, we’re now able to open doors for these students to opportunities they might not have had otherwise. Our bus drivers aren’t just driving students to school, they’re also helping to drive change that goes beyond their daily routines.

    Maximizing opportunities for learning

    Kim Gaither, who drives a Rolling Study Halls bus for Munford Elementary and High School, has said the program dramatically improves her long bus route, which is now quieter due to better student behavior. While Kim’s focusing on the road, the kids get to focus on getting more out of their time on the bus. By extending the time available for learning each day, everyone benefits.
    Principal Michelle Head says Stemley Road Elementary teachers have seen student confidence grow, which she and her teachers attribute to Rolling Study Halls.

    Creating time to connect

    Rolling Study Halls also fosters relationships between onboard educators and their students. Drew Middle School teacher and onboard educator Stuart Bently recently shared with me the story of a former 7th grade student who struggled in class and rarely completed her work. On the bus, Stuart can give this student extra attention and have conversations not just about her assignments, but also about what’s going on in her life. He’s proud that this student is now completing assignments, participating in class and couldn’t wait to show him her last report card.

    Onboard educator and 2nd grade teacher Jessica Moses provides targeted warm-up activities for students on her bus each morning to get them into the right frame of mind before getting to school. These activities are a rare opportunity to have a positive impact on students’ learning attitudes before they even walk through their classroom door, setting the tone for their day.

    Driving change

    Our entire community—parents, teachers and bus drivers—is eager to see how this program will continue to positively impact our students. And we’re inspired to know that what’s working here is also working in other Rolling Study Halls school districts. In Tennessee, the Clarksville-Montgomery County School District has seen improvements in their students’ GPAs at both the middle and high school levels. Teachers have said students are now more comfortable asking for help with assignments or standards they don’t understand.
    The same goes for Lexington County School District One in South Carolina, where 57 percent of participating students say that the Wi-Fi connection on the bus is better than at home and 83 percent say the time on the bus is critical to helping them finish their homework.

    Back home, as we continue to measure success, we’ll also investigate methods for expansion. Talladega County Schools always looks for opportunities to maximize learning for our students. Together with Google, we’re making a difference for them, and we’re especially thankful for our bus drivers who are behind the wheel, making this whole thing run. […]

  • Enabling a safe digital advertising ecosystem

    Google has a crucial stake in a healthy and sustainable digital advertising ecosystem—something we've worked to enable for nearly 20 years. Every day, we invest significant team hours and technological resources in protecting the users, advertisers and publishers that make the internet so useful. And every year, we share key actions and data about our efforts to keep the ecosystem safe by enforcing our policies across platforms.

    Dozens of new ads policies to take down billions of bad ads

    In 2018, we faced new challenges in areas where online advertising could be used to scam or defraud users offline. For example, we created a new policy banning ads from for-profit bail bond providers because we saw evidence that this sector was taking advantage of vulnerable communities. Similarly, when we saw a rise in ads promoting deceptive experiences to users seeking addiction treatment services, we consulted with experts and restricted advertising to certified organizations. In all, we introduced 31 new ads policies in 2018 to address abuses in areas including third-party tech support, ticket resellers, cryptocurrency and local services such as garage door repairmen, bail bonds and addiction treatment facilities.

    We took down 2.3 billion bad ads in 2018 for violations of both new and existing policies, including nearly 207,000 ads for ticket resellers, over 531,000 ads for bail bonds and approximately 58.8 million phishing ads. Overall, that’s more than six million bad ads, every day.

    As we continue to protect users from bad ads, we’re also working to make it easier for advertisers to ensure their creatives are policy compliant.
    Similar to our AdSense Policy Center, next month we’ll launch a new Policy manager in Google Ads that will give tips on common policy mistakes to help well-meaning advertisers and make it easier to create and launch compliant ads.

    Taking on bad actors with improved technology

    Last year, we also made a concerted effort to go after the bad actors behind numerous bad ads, not just the ads themselves. Using improved machine learning technology, we were able to identify and terminate almost one million bad advertiser accounts, nearly double the amount we terminated in 2017. When we take action at the account level, it helps to address the root cause of bad ads and better protect our users.

    In 2017, we launched new technology that allows for more granular removal of ads from websites when only a small number of pages on a site are violating our policies. In 2018, we launched 330 detection classifiers to help us better detect "badness" at the page level—that's nearly three times the number of classifiers we launched in 2017. So while we terminated nearly 734,000 publishers and app developers from our ad network, and removed ads completely from nearly 1.5 million apps, we were also able to take more granular action by taking ads off of nearly 28 million pages that violated our publisher policies. We use a combination of manual reviews and machine learning to catch these kinds of violations.

    Addressing key challenges within the digital ads ecosystem

    From reports of “fake news” sites, to questions about who is purchasing political ads, to massive ad fraud operations, there are fundamental concerns about the role of online advertising in society. Last year, we launched a new policy for election ads in the U.S. ahead of the 2018 midterm elections. We verified nearly 143,000 election ads in the U.S. and launched a new political ads transparency report that gives more information about who bought election ads.
    And in 2019, we’re launching similar tools ahead of elections in the EU and India.

    We also continued to tackle the challenge of misinformation and low-quality sites, using several different policies to ensure our ads are supporting legitimate, high-quality publishers. In 2018, we removed ads from approximately 1.2 million pages, more than 22,000 apps, and nearly 15,000 sites across our ad network for violations of policies directed at misrepresentative, hateful or other low-quality content. More specifically, we removed ads from almost 74,000 pages for violating our “dangerous or derogatory” content policy, and took down approximately 190,000 ads for violating this policy. This policy includes a prohibition on hate speech and protects our users, advertisers and publishers from hateful content across platforms.

    How we took down one of the biggest ad fraud operations ever in 2018

    In 2018, we worked closely with cybersecurity firm White Ops, the FBI, and others in the industry to take down one of the largest and most complex international ad fraud operations we’ve ever seen. Codenamed "3ve," the operation used sophisticated tactics aimed at exploiting data centers, computers infected with malware, spoofed fraudulent domains and fake websites. In aggregate, 3ve produced more than 10,000 spoofed, fraudulent domains, and generated over 3 billion daily bid requests at its peak.

    3ve tried to evade our enforcement, but we conducted a coordinated takedown of its infrastructure. We referred the case to the FBI, and late last year charges were announced against eight individuals for crimes including aggravated identity theft and money laundering. Learn more about 3ve and our work to take it down on our Security Blog, as well as through this white paper that we co-authored with White Ops.

    We will continue to tackle these issues, because as new trends and online experiences emerge, so do new scams and bad actors.
In 2019, our work to protect users and enable a safe advertising ecosystem that works well for legitimate advertisers and publishers continues to be a top priority. […]
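    The “more than six million bad ads, every day” figure earlier in this post is simply the annual total spread across the year. A quick sanity check (variable names are ours):

```python
ads_removed_2018 = 2_300_000_000   # "2.3 billion bad ads in 2018"
per_day = ads_removed_2018 / 365   # average removals per day
print(f"{per_day:,.0f} bad ads per day")  # about 6.3 million
```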

  • Google Marketing Live broadcast on May 14th: register today

    Did you know that searches for “best” have increased by 80% in the last two years? For example, in recent third-party research, we saw that some people spend over 50 days searching for the “best chocolate” before making a decision.

    Google Marketing Live is happening on May 14th—register now to discover how to take action on new consumer insights like these and learn about the latest digital marketing products from Google. For the first time, we'll also be live streaming 8+ hours of additional content from the event. Engage directly with product managers through live Q&A, learn new best practices, and get an inside look at how our latest products are developed. […]

  • Grow your games business with ads

    There’s so much that goes into building a great mobile game. Building a thriving business on top of it? That’s next level. Today, we’re announcing new solutions to increase the lifetime value of your players. Now, it’s easier than ever to re-engage your audience and take advantage of a new, smarter approach to monetization.

    Help inactive players rediscover your game

    Let's face it, the majority of players you acquire aren't going to continue engaging with your game after just a handful of days. One of the biggest opportunities you have to grow your business is to get those inactive players to come back and play again.

    We’re introducing App campaigns for engagement in Google Ads to help players rediscover your game by engaging them with relevant ads across Google’s properties. With App campaigns for engagement, you can reconnect with players in many different ways, such as encouraging lapsed players to complete the tutorial, introducing new features that have been added since a player’s last session, or getting someone to open the game for the first time on Android (which only Google can help with). Learn more about it here or talk to your Google account representative if you’re interested in trying it out.

    Generate revenue from non-spending players

    Acquiring and retaining users is important, but retention alone doesn’t generate revenue. Our internal data shows that, on average, less than four percent of players will ever spend on in-app items. One way to increase overall revenue is through ads. However, some developers worry that ads might hurt in-app purchase revenue by disrupting gameplay for players who do spend. What if you could just show ads to the players who aren't going to spend in your app? Good news—now you can.

    We’re bringing a new approach to monetization that combines ads and in-app purchases in one automated solution.
    Available today, new smart segmentation features in Google AdMob use machine learning to segment your players based on their likelihood to spend on in-app purchases. Ad units with smart segmentation will show ads only to users who are predicted not to spend on in-app purchases. Players who are predicted to spend will see no ads, and can simply continue playing. Check it out by creating an interstitial ad unit with smart segmentation enabled.

    To learn more about new ways to help you increase the lifetime value of your players, please join us at the Game Developers Conference. Location and details are below:

    What: Google Ads Keynote
    Where: Moscone West, room #2020
    When: Wednesday, March 20th at 12:30 PM

    I'm excited for the week ahead and all the new games you’re building—I’m always on the lookout for my next favorite. […]

  • Helping Indonesia prepare for disasters

    In September last year, a large earthquake struck the Indonesian island of Sulawesi. Within hours, a tsunami hit Palu, the provincial capital. Over two thousand lives were lost, making it the deadliest earthquake of 2018. Google.org and Googlers around the world responded by donating $1 million to support relief efforts led by Save the Children and the International Federation of Red Cross and Red Crescent Societies. We also rolled out our crisis response alerts and tools to provide emergency info to those impacted.

    This earthquake was only one of more than 2,000 disasters to strike Indonesia last year. Altogether, the government has estimated that these disasters affected some three million people, causing billions of dollars in damages and a tragic loss of life. Unfortunately, 2018 was not an anomaly, and we know that Indonesia will continue to be challenged by natural disasters. At Google.org, we look to help nonprofits on the frontlines of global crises through funding and volunteers. But we also believe in supporting solutions that could help mitigate the impact of future crises.

    This is why we’re now helping Save the Children’s Indonesian partner, Yayasan Sayangi Tunas Cilik, with a $1 million grant. Through this grant, they’ll implement a national awareness campaign using online and offline platforms to ensure that schools are safe and children are better prepared for emergencies. It’s anticipated they’ll reach over half a million people, a majority of whom are women and children, some of the most vulnerable people in a time of crisis. Yayasan Sayangi Tunas Cilik will also engage in capacity building with local government bodies in order to improve coordination, planning and response at the provincial and district levels.

    Announcing a grant to Yayasan Sayangi Tunas Cilik in Jakarta.
    From left to right: Randy Jusuf (Google Indonesia), Rudiantara (Minister of Communication and Informatics of Indonesia), Jacquelline Fuller (Google.org), Selina Sumbung (Chairperson, Save the Children-Yayasan Sayangi Tunas Cilik), and Bambang Surya Putra (Directorate of Disaster Preparedness, National Disaster Management Agency).

    While disasters like the Sulawesi earthquake are unavoidable, I’m encouraged by the potential of what we can do together to ensure we’re as prepared as we can be. We hope that the learnings from this project will provide a strong framework to scale this work and contribute to long-term sustainable disaster preparedness and awareness. […]

  • Helping Latino students learn to code

    Growing up in Costa Rica, I was always passionate about creating things and solving puzzles. That’s what drove me to computer science. I saw it as an opportunity to explore my interests and open doors to new possibilities. It's that love and passion that eventually helped me get to Google, and to the United States, where I now live.

    Computer science requires students to learn how to think in a totally new way. Getting into that mindset can be really hard for anyone, but it can be even tougher if you’re learning key phrases, concepts, and acronyms in an environment that feels different from your everyday life.

    That’s why I’m proud to share that Google.org is making a $5 million grant to UnidosUS, the YWCA and the Hispanic Heritage Foundation. The grant will bring computer science (CS) education to over one million Latino students and their families by 2022 with computer science curricula, including CS First, Google’s coding curriculum for elementary and middle school students. Additionally, it will support students' experience of how they learn about computer science, helping them explore CS and offering culturally relevant resources to engage parents.

    This $5 million grant is part of a new $25 million commitment in 2019 to increase Black and Latino students’ access to CS and AI education across the US. This initiative will help these students develop the technical skills and confidence they need for the future, and help prepare them to succeed in the careers they pursue.

    Even as a fluent English speaker, I can’t count the number of times people misunderstand me because I pronounce things differently, or the times it takes me a little longer to understand because my day-to-day work language is not my primary language.
    This language barrier is not the only barrier—students from underrepresented communities, especially those who are Black and Latino, often don’t feel represented in or connected to their first introduction to the field. While Black and Latino students have equal interest in CS education, they often face social barriers to learning CS, such as a lack of role models, and a lack of learning materials that reflect their lived experiences, like materials in a language they understand. On top of these social barriers, these students often face structural barriers, such as not having equal access to learn CS in or outside of the classroom.

    Along with the grant, CS First is launching its first set of lessons in Spanish. In the first activity, "Animar un nombre" (“Animate a name”), students choose the name of something or someone they care about and bring the letters to life using code. The second activity, "Un descubrimiento inusual" (“An unusual discovery”), encourages students to code a story about when two characters discover a surprising object.

    Today’s announcement is an exciting part of Google.org’s work to support students who have historically been underrepresented in computer science. These grants to partner organizations will help Black and Latino students access materials and engage with role models who feel connected to their culture. We will also help create more opportunities for students to access the courses they need to continue their studies.

    To me, the new Spanish coding lessons are more than just a fun way to learn coding. They are opportunities for entire communities of students to see themselves reflected in computer science education materials, perhaps for the first time. It’s our hope that students like the ones I met will use CS to create more inventions and opportunities for us all. […]

  • How I started traveling the world on my own, thanks to Google

    I took my first solo vacation in June 2017, and the experience changed my life. After countless searches of Norway’s beautiful landscapes on Google Images, I decided to make the trip there all by myself. The following year, I took my second solo trip, this time to New Zealand. After that, I knew I didn’t want to wait until my retirement to travel—I wanted to do this full-time. So I decided to quit my job this past June, exactly one year after my first solo trip, to make travel my career. It was one of the best decisions I have ever made.

    My post-corporate journey has been nothing short of thrilling. I toured Europe for 58 days, visiting 15 countries while making a living as a social media and branding consultant. Even though the experience has not always been easy (I got my phone stolen in Paris, for one), I’ve kept going, attending conferences and seeing the most scenic islands in the world.

    Skydiving in Switzerland.

    On my first solo trip, I became a Local Guide on Google Maps, writing reviews of restaurants in Norway from the perspective of a solo female traveler—and one who kept a strict vegetarian diet. Throughout my travels, I continued adding photos and reviews about my experiences and reached Level 8 out of 10 in the Local Guides program. I wrote a post on Local Guides Connect sharing what I learned from various trips, and used the forum to ask for suggestions on places to visit. I even made a ton of friends from the program along the way, thanks to taking part in conversations on the forum. My fellow Local Guides have helped me find national parks to visit in Italy and places to visit in Portland, Oregon. And this past October, I was one of 151 Local Guides selected to attend the Connect Live 2018 conference in San Francisco.

    Throughout my time as a full-time traveler, and as a Local Guide, I’ve picked up some tips on how to experience the world, even when you’re alone.
    Here’s my advice on how to plan a successful solo trip, with help from Google.

    Visiting Hobbit houses in New Zealand.

    Plan your itinerary wisely.

    Start off by creating a list on Google Maps of all the places you want to visit. Once you have your destination in mind, use the Google Trips app to plan your travel and organize your itinerary. And if you want specific recommendations while you’re there, check out the Local Guides Connect forum.

    One of the best parts of my Europe solo trip was meeting a Local Guides Connect moderator, Ermes. I was in Venice and had messaged him the previous day about my travel. In spite of the short notice, he drove down and met me! I got a Local Guide to show me around Venice, and it was so helpful and memorable. If you can’t get a Local Guide to be your personal advisor, try signing up for organized tours. You can relax and enjoy the scenery as a tour guide shares stories of local places, and navigates you through a new city.

    Save money wherever you can.

    If you use a travel agent, have them get you itineraries, but plan and book the places yourself to save some cash. Use public transportation wherever possible, and try staying at a hostel for a cheaper stay that lets you meet, mingle and share your travel experiences with other visitors. And if you need to apply for a visa, do it yourself, so you have more money for that kayaking trip of yours.

    Taking in the sights in Paris.

    Stay safe.

    Download an area of the city you’re visiting on Google Maps so that it is available offline. That way, you don’t have to worry about finding your way when you don’t have an internet connection. Share your real-time location on Google Maps with someone you trust, like your family or close friends, so they know where you are. (My parents love this feature.)

    Make the most of the daylight by waking up early and visiting places you want to see in the first part of the day.
And ask someone at your hotel, hostel or vacation rental whether it is safe to walk in your neighborhood at night or use public transportation after dark. And use the Google Translate app when you are in a foreign country and don’t understand the language.

Keep an eye on what’s next. The most rewarding part of solo travel is the luxury of designing your own itinerary and seeing your travel plans come to life. For me, the next place I hope to visit is Banff, Canada, and I’ll bring my knowledge as a solo traveler (and Local Guide) along for the ride. […]

  • How companies are finding the right device with Android Enterprise Recommended

We recently returned from Mobile World Congress, where many of our hardware partners showcased their latest devices and how Android is shaping the future of mobility. When we launched the Android Enterprise Recommended program, our goal was to provide the mobile ecosystem with powerful and versatile solutions, validated for enterprise use. The first validated knowledge-worker devices debuted just over a year ago, and since then the program has expanded to rugged devices, enterprise mobility management solutions and, most recently, managed service providers. Today, we’re sharing highlights of how our partners are embracing Android Enterprise Recommended for the choice and cost savings it provides, and for how it empowers companies to choose devices with confidence.

Embracing device choice

SAP, a leader in enterprise application software, wanted to give its teams greater device choice and embrace the latest Android Enterprise management features. The company was looking for a solution to transition 9,000 corporate-owned devices onto modern Android Enterprise management, with the separation of business and personal data delivered by the work profile.

Jarmo Akkanen, SAP Global Service Owner, Mobile Operations, said that with Android Enterprise Recommended the company was able to confidently choose devices that it knew met strict security requirements and supported rapid deployment features: “We urgently wanted to offer our colleagues more choice for their mobile workplace. We found that Google’s Android Enterprise Recommended program is a good opportunity to broaden our portfolio of managed company-owned smartphones.”

Lower cost and speedy deployment

As part of this device strategy, the SAP IT team was also looking for a way to reduce the total cost of ownership with high-quality devices across various price points. Nokia has a diverse portfolio of Android Enterprise Recommended devices that range from high-end to more cost-efficient options.
This gave SAP flexibility in choosing devices that met the same rigorous standards for the enterprise, regardless of cost. Part of the equation in lowering costs is time: SAP is transitioning to zero-touch enrollment for all Android Enterprise Recommended devices so the company can deploy corporate devices in bulk without any manual setup. Employees will get their devices with the right apps and management settings already configured.

Finding great devices

Yorkshire Building Society (YBS), a financial institution based in the UK, had numerous devices that were either unmanaged or running a legacy operating system and management solution, and the company’s IT team was eager to modernize its management framework. After investigating both iOS and Android-based approaches, the YBS IT team migrated its mobile device infrastructure to Android Enterprise Recommended devices. The company made the Nokia 7 Plus its new corporate standard phone and developed a company app store to distribute applications through managed Google Play.

YBS’ End User Computing Delivery Manager Andrew Ellison said employee feedback about the transition was positive, and that the new phones bring together a smooth and consistent software experience with excellent hardware. “Thanks to the migration to Android, we will give our employees a good user experience and offer an extended set of services and software to them,” he said. “With Android Enterprise we will be able to offer our colleagues a personal user experience on corporate-owned devices without compromising security and manageability.”

These are just a couple of examples of what companies are doing with Android. We’re looking forward to hearing how more customers and partners are embracing the potential of Android. […]

  • 2018, celebrating our global Webmaster community

2018 has been a very important year for our webmaster support community. What has happened? There’s been a program rebranding, a global summit, and loads of community hangouts.

In October, the former Top Contributors became Gold Product Experts, and the Rising Stars became Silver Product Experts. This rebranding happened throughout all of the product forums, and these are some of the new badges and names:

Silver Product Expert: Newer members who are developing their product knowledge
Gold Product Expert: Trusted members who are knowledgeable and active contributors

In November, we invited all of our Gold Product Experts from every Google help forum (such as Blogger or Google My Business) to a global summit. This meetup took place at the Google campus in Sunnyvale, California. Of the almost 550 attendees from all over the world, around 70 were Webmaster Gold Product Experts. Representing 25 different countries, they were the second biggest community attending the event. Later that month, another very successful meetup took place in Moscow, gathering 23 Russian-speaking Product Experts (of which 10 were Webmasters).

Gold Webmaster Product Experts at this year’s global summit in Sunnyvale.

Many of the attendees acknowledged that this “was a really valuable time”, that the “sessions were very insightful and interesting” and that “the entire event was fantastic!”.

This knowledgeable group of super users provides invaluable help in 16 languages to more than 2 million users a year in the forums, on everything related to Search, structured data or Search Console.

And what is the profile of our community? Many of our Product Experts (Silver and Gold) are site owners who started out on the Webmaster forums (some more than a decade ago) by asking questions about their own sites. After their issues were fixed, most of them stayed to give back to the community, as they realized that their expertise could be of use to others.
We want to thank all of our experts for their dedication and constant knowledge sharing to help users who are having trouble with their websites.

Throughout the year, we’ve held 75 live office-hours hangouts on the Webmaster YouTube channel in English, Japanese, German, Hindi and French, and we’ve also kick-started the calls in Spanish. In those hangouts, anyone can ask the Google team questions directly and interact with one another.

If you’re interested in joining the community, meeting everyone and helping other users on the Webmaster forums, you can learn more on the Product Experts program website. We are always excited to meet users from diverse backgrounds and skill sets!

Looking forward to what 2019 will bring to our community... And looking forward to meeting you!

Written by Aurora Morales, Trust & Safety Outreach team […]

  • An update on the Google Webmaster Central blog comments

For every train there's a passenger, but it turns out comments are not our train.

Over the years, we've read thousands of comments on our blog posts on the Google Webmaster Central blog. Sometimes they were extremely thoughtful, other times they made us laugh out loud, but most of the time they were off-topic or even outright spammy; if you think about it, the latter is rather ironic, considering this is the Google Webmaster Blog.

Effective today, we're closing the commenting feature on the Google Webmaster Central blog. Instead of reading the comments here on the blog, we're going to focus on interacting with the community on our other channels. For all of our subsequent posts, if you have comments, feedback, or funny stories, you can find us in our help forums or on Twitter.

Posted by Gary, House elf […]

  • An update to referral source URLs for Google Images

UPDATE: After testing and further consideration, we have determined that the best place to measure query and click traffic from Google Images is in the Search Console Performance Report. Accordingly, we will continue to use www.google.com (or the appropriate ccTLD) as the referrer URL for all traffic from Google Images, and will not be providing a Google Images-specific referrer URL (images.google.com).

Every day, hundreds of millions of people use Google Images to visually discover and explore content on the web. Whether it be finding ideas for your next baking project, or visual instructions on how to fix a flat tire, exploring image results can sometimes be much more helpful than exploring text.

Updating the referral source

For webmasters, it hasn't always been easy to understand the role Google Images plays in driving site traffic. To address this, we will roll out a new referrer URL specific to Google Images over the next few months. The referrer URL is part of the HTTP header, and indicates the last page the user was on and clicked to visit the destination webpage. If you create software to track or analyze website traffic, we want you to be prepared for this change. Make sure that you are ingesting the new referrer URL, and attribute the traffic to Google Images. The new referrer URL is https://images.google.com. If you use Google Analytics to track site data, the new referral URL will be automatically ingested and traffic will be attributed to Google Images appropriately. Just to be clear, this change will not affect Search Console. Webmasters will continue to receive an aggregate list of top search queries that drive traffic to their site.

How this affects country-specific queries

The new referrer URL has the same country code top-level domain (ccTLD) as the URL used for searching on Google Images. In practice, this means that most visitors worldwide come from images.google.com. That's because last year, we made a change so that google.com became the default choice for searchers worldwide.
However, some users may still choose to go directly to a country-specific service, such as google.co.uk for the UK. For this use case, the referrer uses that country's TLD (for example, images.google.co.uk).

We hope this change will foster a healthy visual content ecosystem. If you're interested in learning how to optimize your pages for Google Images, please refer to the Google Image Publishing Guidelines. If you have questions, feedback or suggestions, please let us know through the Webmaster Tools Help Forum.

Posted by Ashutosh Agarwal, Product Manager, Google Images […]

  • Announcing domain-wide data in Search Console

Google recommends verifying all versions of a website -- http, https, www, and non-www -- in order to get the most comprehensive view of your site in Google Search Console. Unfortunately, many separate listings can make it hard for webmasters to understand the full picture of how Google “sees” their domain as a whole. To make this easier, today we're announcing "domain properties" in Search Console, a way of verifying and seeing the data from Google Search for a whole domain.

Domain properties show data for all URLs under the domain name, including all protocols, subdomains, and paths. They give you a complete view of your website across Search Console, reducing the need to manually combine data. So regardless of whether you use m-dot URLs for mobile pages, or are (finally) getting the migration to HTTPS set up, Search Console will be able to help with a complete view of your site's data with regard to how Google Search sees it.

If you already have DNS verification set up, Search Console will automatically create new domain properties for you over the next few weeks, with data in all reports. Otherwise, to add a new domain property, go to the property selector, add a new domain property, and use DNS verification.

We recommend using domain properties where possible going forward. Domain properties were built based on your feedback; thank you again for everything you've sent our way over the years! We hope this makes it easier to manage your site, and to get a complete overview without having to manually combine data. Should you have any questions, feel free to drop by our help forums, or leave us a comment on Twitter. And as always, you can use the feedback feature built into Search Console.

Posted by Erez Bixon, Search Console Team […]
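DNS verification works by placing a Google-issued token in a TXT record on the domain. As a hedged illustration (the helper function below is hypothetical, not a Google API, and the token value is made up; the google-site-verification= prefix is the documented record format), you can check whether a domain's TXT records contain a given token. In Node, dns.promises.resolveTxt(domain) returns records in the string[][] shape used here:

```javascript
// Hypothetical helper: returns true if any TXT record on the domain
// contains the expected verification token.
function hasVerificationToken(txtRecords, token) {
  // Each record arrives as an array of string chunks; join them first.
  return txtRecords.some(chunks => chunks.join('').includes(token));
}

// Example with canned records (a live check would use dns.promises.resolveTxt):
const records = [
  ['v=spf1 include:_spf.example.com ~all'],
  ['google-site-verification=abc123def456'],
];
console.log(hasVerificationToken(records, 'google-site-verification=abc123def456')); // true
```

Because the token sits at the domain level, a single record covers every protocol and subdomain, which is what lets one domain property aggregate them all.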

  • Collaboration and user management in the new Search Console

As part of our reinvention of Search Console, we have been rethinking the models of facilitating cooperation and accountability for our users. We decided to redesign the product around cooperative team usage and transparency of action history. The new Search Console will gradually provide better history tracking to show who performed which significant property-affecting modifications, such as changing a setting, validating an issue or submitting a new sitemap. In that spirit, we also plan to enable all users to see critical site messages.

New features

User management is now an integral part of Search Console.
The new Search Console enables you to share a read-only view of many reports, including Index coverage, AMP, and Mobile Usability. Learn more.
A new user management interface enables all users to see and, if appropriate, manage user roles for all property users.

New role definition

In order to provide a simpler permission model, we are planning to limit the "restricted" user role to read-only status. While still able to see all information, read-only users will no longer be able to perform any state-changing actions, including starting a fix validation or sharing an issue.

Best practices

As a reminder, here are some best practices for managing user permissions in Search Console:

Grant users only the permission level that they need to do their work. See the permissions descriptions.
If you need to share an issue details report, click the Share link on that page.
Revoke permissions from users who no longer work on a property.
When removing a previously verified owner, be sure to remove all verification tokens for that user.
Regularly audit and update the user permissions using the Users & Permissions page in the new Search Console.

User feedback

As part of our Beta exploration, we released visibility of the user management interface to all user roles.
Some users reached out to request more time to prepare for the updated user management model, including the ability of restricted and full users to easily see a list of other collaborators on the site. We’ve taken that feedback and will hold off on that part of the launch. Stay tuned for more updates relating to collaboration tools and changes on our permission models. As always, we love to hear feedback from our users. Feel free to use the feedback form within Search Console, and we welcome your discussions in our help forums as well! Posted by John Mueller, Google Switzerland […]

  • Consolidating your website traffic on canonical URLs

In Search Console, the Performance report currently credits all page metrics to the exact URL that the user is referred to by Google Search. Although this provides very specific data, it makes property management more difficult; for example, if your site has mobile and desktop versions on different properties, you must open multiple properties to see all your Search data for the same piece of content.

To help unify your data, Search Console will soon begin assigning search metrics to the (Google-selected) canonical URL, rather than the URL referred to by Google Search. This change has several benefits:

It unifies all search metrics for a single piece of content into a single URL: the canonical URL. This shows you the full picture about a specific piece of content in one property.
For users with separate mobile or AMP pages, it unifies all (or most, since some mobile URLs may end up as canonical) of your data to a single property (the "canonical" property).
It improves the usability of the AMP and Mobile-Friendly reports. These reports currently show issues in the canonical page property, but show the impression in the property that owns the actual URL referred to by Google Search. After this change, the impressions and issues will be shown in the same property.

When will this happen?

We plan to transition all performance data on April 10, 2019. In order to provide continuity to your data, we will pre-populate your unified data beginning from January 2018. We will also enable you to view both old and new versions for a few weeks during the transition to see the impact and understand the differences.

API and Data Studio users: The Search Console API will change to canonical data on April 10, 2019.

How will this affect my data?

At an individual URL level, you will see traffic shift from any non-canonical (duplicate) URLs to the canonical URL.
At the property level, you will see data from your alternate property (for example, your mobile site) shifted to your "canonical property". Your alternate property traffic probably won't drop to zero in Search Console, because canonicalization is at the page level, not the property level, and your mobile property might have some canonical pages. However, for most users, most property-level data will shift to one property. AMP property traffic will drop to zero in most cases (except for self-canonical pages). You will still be able to filter data by device, search appearance (such as AMP), country, and other dimensions without losing important information about your traffic. You can see some examples of these traffic changes below.

Preparing for the change

Consider whether you need to change user access to your various properties; for example, do you need to add new users to your canonical property, or do existing users continue to need access to the non-canonical properties?
Modify any custom traffic reports you might have created in order to adapt to this traffic shift.
If you need to learn the canonical URL for a given URL, you can use the URL Inspection tool.
If you want to save your traffic data calculated using the current system, you should download your data using either the Performance report's Export Data button, or the Search Console API.

Examples

Here are a few examples showing how data might change on your site. In these examples, you can see how your traffic numbers would change between a canonical site (called example.com) and an alternate site (called m.example.com). Important: In these examples, the desktop site contains all the canonical pages and the mobile site contains all the alternate pages. In the real world, your desktop site might contain some alternate pages and your mobile site might contain some canonical pages. You can determine the canonical for a given URL using the URL Inspection tool.
Total traffic

In the current version, some of your traffic is attributed to the canonical property and some to the alternate property. The new version attributes all of your traffic to the canonical property: in the example, the canonical property's totals increase (+0.7K and +3K) by exactly what the alternate property loses (-0.7K and -3K).

Individual page traffic

You can see traffic changes between the duplicate and canonical URLs for individual pages in the Pages view. In the next example, traffic that used to be split between the canonical and alternate pages is now all attributed to the canonical URL, which gains (+150 and +800) what the alternate page loses (-150 and -800).

Mobile traffic

In the current version, all of your mobile traffic was attributed to your m. property. The new version attributes all traffic to your canonical property when you apply the "Device: Mobile" filter; again, the canonical property gains (+0.7K and +3K) what the alternate property loses (-0.7K and -3K).

In conclusion

We know that this change might seem a little confusing at first, but we're confident that it will simplify your job of tracking traffic data for your site. If you have any questions or concerns, please reach out on the Webmaster Help Forum.

Posted by John Mueller, Developer Advocate, Zurich […]
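The shift described above can be sketched as a simple aggregation: given per-URL metrics and a duplicate-to-canonical mapping, the new report credits each canonical URL with the sum over its duplicates. The mapping, URLs, and numbers below are made up for illustration (they echo the example sites, not real data):

```javascript
// Sketch: re-attribute per-URL metrics to their canonical URLs.
// canonicalOf maps each duplicate URL to its Google-selected canonical.
function attributeToCanonical(clicksByUrl, canonicalOf) {
  const out = {};
  for (const [url, clicks] of Object.entries(clicksByUrl)) {
    const canonical = canonicalOf[url] || url; // a URL may be its own canonical
    out[canonical] = (out[canonical] || 0) + clicks;
  }
  return out;
}

// Hypothetical example: a desktop (canonical) page and its m-dot duplicate.
const clicks = {
  'https://example.com/page': 800,
  'https://m.example.com/page': 150,
};
const canonicalOf = {
  'https://m.example.com/page': 'https://example.com/page',
};
console.log(attributeToCanonical(clicks, canonicalOf));
// { 'https://example.com/page': 950 }
```

This is why alternate-property traffic drops rather than disappears: it is summed into the canonical property, and any page that is its own canonical keeps its traffic where it was.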

  • Dynamic Rendering with Rendertron

Many frontend frameworks rely on JavaScript to show content. This can mean Google might take some time to index your content or update the indexed content. A workaround we discussed at Google I/O this year is dynamic rendering. There are many ways to implement this. This blog post shows an example implementation of dynamic rendering using Rendertron, which is an open source solution based on headless Chromium.

Which sites should consider dynamic rendering?

Not all search engines or social media bots visiting your website can run JavaScript. Googlebot, for example, might take time to run your JavaScript and has some limitations. Dynamic rendering is useful for content that changes often and needs JavaScript to display. Your site's user experience (especially the time to first meaningful paint) may benefit from considering hybrid rendering (for example, Angular Universal).

How does dynamic rendering work?

Dynamic rendering means switching between client-side rendered and pre-rendered content for specific user agents. You will need a renderer to execute the JavaScript and produce static HTML. Rendertron is an open source project that uses headless Chromium to render. Single Page Apps often load data in the background or defer work to render their content. Rendertron has mechanisms to determine when a website has completed rendering. It waits until all network requests have finished and there is no outstanding work.

This post covers:

Taking a look at a sample web app
Setting up a small express.js server to serve the web app
Installing and configuring Rendertron as a middleware for dynamic rendering

The sample web app

The “kitten corner” web app uses JavaScript to load a variety of cat images from an API and displays them in a grid. Cute cat images in a grid and a button to show more - this web app truly has it all!
Here is the JavaScript:

  const apiUrl = '';
  const tpl = document.querySelector('template').content;
  const container = document.querySelector('ul');

  function init () {
    fetch(apiUrl)
      .then(response => response.json())
      .then(cats => {
        container.innerHTML = '';
        cats
          .map(cat => {
            const li = document.importNode(tpl, true);
            li.querySelector('img').src = cat.url;
            return li;
          })
          .forEach(li => container.appendChild(li));
      });
  }

  init();
  document.querySelector('button').addEventListener('click', init);

The web app uses modern JavaScript (ES6), which isn't supported in Googlebot yet. We can use the mobile-friendly test to check if Googlebot can see the content.

The mobile-friendly test shows that the page is mobile-friendly, but the screenshot is missing all the cats! The headline and button appear, but none of the cat pictures are there. While this problem is simple to fix, it's a good exercise to learn how to set up dynamic rendering. Dynamic rendering will allow Googlebot to see the cat pictures without changes to the web app code.

Set up the server

To serve the web application, let's use express, a node.js library for building web servers.

The server code looks like this (find the full project source code here):

  const express = require('express');
  const app = express();

  const DIST_FOLDER = process.cwd() + '/docs';
  const PORT = process.env.PORT || 8080;

  // Serve static assets (images, css, etc.)
  app.get('*.*', express.static(DIST_FOLDER));

  // Point all other URLs to index.html for our single page app
  app.get('*', (req, res) => {
    res.sendFile(DIST_FOLDER + '/index.html');
  });

  // Start Express Server
  app.listen(PORT, () => {
    console.log(`Node Express server listening on http://localhost:${PORT} from ${DIST_FOLDER}`);
  });

You can try the live example here - you should see a bunch of cat pictures if you are using a modern browser.
To run the project from your computer, you need node.js. Run the following commands:

  npm install --save express rendertron-middleware
  node server.js

Then point your browser to http://localhost:8080. Now it’s time to set up dynamic rendering.

Deploy a Rendertron instance

Rendertron runs a server that takes a URL and returns static HTML for the URL by using headless Chromium. We'll follow the recommendation from the Rendertron project and use Google Cloud Platform.

The form to create a new Google Cloud Platform project.

Please note that while you can get started with the free usage tier, using this setup in production may incur costs according to the Google Cloud Platform pricing.

1. Create a new project in the Google Cloud console. Take note of the “Project ID” below the input field.
2. Install the Google Cloud SDK as described in the documentation and log in.
3. Clone the Rendertron repository from GitHub with:

  git clone https://github.com/GoogleChrome/rendertron.git
  cd rendertron

4. Run the following command to install dependencies and build Rendertron on your computer:

  npm install && npm run build

5. Enable Rendertron’s cache by creating a new file called config.json in the rendertron directory with the following content:

  { "datastoreCache": true }

6. Run the following command from the rendertron directory. Substitute YOUR_PROJECT_ID with your project ID from step 1:

  gcloud app deploy app.yaml --project YOUR_PROJECT_ID

7. Select a region of your choice and confirm the deployment. Wait for it to finish.
8. Enter the URL YOUR_PROJECT_ID.appspot.com (substituting your actual project ID from step 1) in your browser. You should see Rendertron’s interface with an input field and a few buttons.

Rendertron’s UI after deploying to Google Cloud Platform.

When you see the Rendertron web interface, you have successfully deployed your own Rendertron instance. Take note of your project’s URL (YOUR_PROJECT_ID.appspot.com) as you will need it in the next part of the process.

Add Rendertron to the server

The web server is using express.js, and Rendertron has an express.js middleware.
Run the following command in the directory of the server.js file:

  npm install --save rendertron-middleware

This command installs the rendertron-middleware from npm so we can add it to the server:

  const express = require('express');
  const app = express();
  const rendertron = require('rendertron-middleware');

Configure the bot list

Rendertron uses the user-agent HTTP header to determine if a request comes from a bot or a user’s browser. It has a well-maintained list of bot user agents to compare with. By default this list does not include Googlebot, because Googlebot can execute JavaScript. To make Rendertron render Googlebot requests as well, add Googlebot to the list of user agents:

  const BOTS = rendertron.botUserAgents.concat('googlebot');
  const BOT_UA_PATTERN = new RegExp(BOTS.join('|'), 'i');

Rendertron compares the user-agent header against this regular expression later.

Add the middleware

To send bot requests to the Rendertron instance, we need to add the middleware to our express.js server. The middleware checks the requesting user agent and forwards requests from known bots to the Rendertron instance. Add the following code to server.js, and don’t forget to substitute “YOUR_PROJECT_ID” with your Google Cloud Platform project ID:

  app.use(rendertron.makeMiddleware({
    proxyUrl: 'https://YOUR_PROJECT_ID.appspot.com/render',
    userAgentPattern: BOT_UA_PATTERN
  }));

Bots requesting the sample website receive the static HTML from Rendertron, so the bots don’t need to run JavaScript to display the content.

Testing our setup

To test if the Rendertron setup was successful, run the mobile-friendly test again. Unlike in the first test, the cat pictures are visible. In the HTML tab we can see all the HTML the JavaScript code generated, and that Rendertron has removed the need for JavaScript to display the content.

Conclusion

You created a dynamic rendering setup without making any changes to the web app. With these changes, you can serve a static HTML version of the web app to crawlers.

Posted by Martin Splitt, Open Web Unicorn […]
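To sanity-check the user-agent matching described above, you can exercise the same pattern construction against example UA strings. The bot list below is a small illustrative subset chosen for this sketch (the real list comes from rendertron.botUserAgents and is not reproduced here):

```javascript
// Illustrative subset of bot user agents, plus 'googlebot' as in the setup above.
const BOTS = ['bingbot', 'twitterbot', 'facebookexternalhit'].concat('googlebot');
const BOT_UA_PATTERN = new RegExp(BOTS.join('|'), 'i');

const googlebotUA =
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';
const chromeUA =
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 Chrome/71.0 Safari/537.36';

// Case-insensitive matching: requests matching the pattern go to Rendertron,
// everything else gets the normal client-side rendered app.
console.log(BOT_UA_PATTERN.test(googlebotUA)); // true: forwarded to Rendertron
console.log(BOT_UA_PATTERN.test(chromeUA));    // false: served client-side
```

Testing the pattern like this before deploying helps catch an overly broad alternation (a substring such as 'bot' alone would match many regular browsers' extensions) or a missing crawler.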

  • Focusing on the new Search Console

Over the last year, the new Search Console has been growing and growing, with the goal of making it easier for site owners to focus on the important tasks. For us, focus means being able to put all of our work into the new Search Console, staying committed to our users, and with that, being able to turn off some of the older, perhaps already-superseded, aspects of the old Search Console. This gives us space to further build out the new Search Console, adding and improving features over time. Here are some of the upcoming changes in Search Console that we're planning on making towards the end of March, 2019:

Crawl errors in the new Index Coverage report

One of the more common pieces of feedback we received was that the list of crawl errors in Search Console was not actionable when it came to setting priorities (it's normal that Google crawls URLs which don't exist; it's not something that needs to be fixed on the website). By changing the focus to the issues and patterns used for site indexing, we believe that site owners will be able to find and fix issues much faster (and when issues are fixed, you can request reprocessing quickly too). With this, we're going to remove the old Crawl Errors report - for desktop, smartphone, and site-wide errors. We'll continue to improve the way issues are recognized and flagged, so if there's something that would help you, please submit feedback in the tools. Along with the Crawl Errors report, we're also deprecating the crawl errors API that's based on the same internal systems. At the moment, we don't have a replacement for this API. We'll inform API users of this change directly.

Sitemaps data in Index Coverage

As we move forward with the new Search Console, we're turning the old sitemaps report off. The new sitemaps report has most of the functionality of the old report, and we're aiming to bring the rest of the information - specifically for images & video - to the new reports over time.
Moreover, to track URLs submitted in sitemap files, you can select and filter by your sitemap files within the Index Coverage report. This makes it easier to focus on the URLs that you care about.

Using the URL inspection tool to fetch as Google

The new URL inspection tool offers many ways to check and review URLs on your website. It provides both a look into the current indexing, as well as a live check of URLs that you've recently changed. This tool also shows more information about URLs, such as the HTTP headers, page resources, the JavaScript console log, and a screenshot of the page. From there, you can also submit pages for reprocessing, to have them added or updated in our search results as quickly as possible.

User management is now in settings

We've improved the user management interface and decreased clutter in the tool by merging it with the Settings section of the new Search Console. This replaces the user-management features in the old Search Console.

Structured data dashboard to dedicated reports per vertical

To help you implement rich results for your site, we added several reports to the new Search Console last year, including Jobs, Recipes, Events and Q&A. We are committed to keep adding reports like these to the new Search Console. When Google encounters a syntax error parsing structured data on a page, it will also be reported in aggregate, to make sure you don't miss anything critical. Other structured data types that are not supported with rich results features will no longer be reported in Search Console. We hope this reduces distraction from non-critical issues, and helps you focus on fixing problems which could be visible in Search.

Letting go of some old features

With the focus on features that we believe are critical to site owners, we've had to make the hard decision to drop some features in Search Console.
In particular:

HTML suggestions - finding short and duplicated titles can be useful for site owners, but Google's algorithms have gotten better at showing and improving titles over the years. We still believe this is something useful for sites to look into, and there are some really good tools that help you crawl your website to extract titles & descriptions too.
Property sets - while they're loved by some site owners, the small number of users makes it hard to justify maintaining this feature. However, we did learn that users need a more comprehensive view of their website, and so we will soon add the option of managing a Search Console account over an entire domain (regardless of scheme and sub-domains). Stay tuned!
Android Apps - most of the relevant functionality has been moved to the Firebase console over the years.
Blocked resources - we added this functionality several years back to help sites unblock CSS and JavaScript files for mobile-friendliness. In the meantime, these issues have become much less common, usage of this tool has dropped significantly, and you can find blocked resources directly in the URL inspection tool.

Please send us feedback!

We realize some of these changes will affect your workflows, so we want to let you know about them as early as possible. Please send us your feedback directly in the new Search Console if there are aspects which are unclear, or which would ideally be different for your use case. For more detailed feedback, please use our help forums; feel free to include screenshots & ideas. In the long run, we believe the new Search Console will make things much easier, and help you focus on the issues, and the opportunities, affecting your site in Search. We're looking forward to an exciting year!

Posted by Hillel Maoz, Search Console Team […]

  • Google is introducing its Product Experts Program!

Over 12 years ago, we started answering webmaster questions and listening to feedback on our webmaster forums (although at the time, it was a Google Group for questions about sitemaps - original announcement). From a small mailing list, these forums have evolved to cover 15 languages and over 50,000 threads per year. These days, we learn a lot from some of the cases surfaced on this platform, and constantly use it to gather feedback to pass on to our teams.

Google's Top Contributors and Rising Stars are some of our most active and helpful members on these forums. With over 100 members globally just for the Webmaster Forums (1,000 members if you count all product forums), this community of experts helps thousands of people every year by sharing their knowledge and helping others get the most out of Google products.

[Image: Some of the Webmaster forum participants]

Today, we're excited to announce that we're rebranding and relaunching the Top Contributor program as Google's Product Experts program! Same community of experts, shiny new brand. Over the following days, we'll be updating our badges in the forums so you can recognize who our most passionate and dedicated Product Experts are:

Silver Product Expert: Newer members who are developing their product knowledge
Gold Product Expert: Trusted members who are knowledgeable and active contributors
Platinum Product Expert: Seasoned members who contribute beyond providing help through mentoring, creating content, and more
Product Expert Alumni: Past members who are no longer active, but were previously recognized for their helpfulness

More information about the new badges and names is available on the Product Experts program website. These Product Experts are users who are passionate about Google products and enjoy helping other users.
They also help us by giving feedback on the tools we all use, like Search Console, by surfacing questions they think Google should answer better, and more. Obtaining feedback from our users is one of Google's core values, and Product Experts often have a great understanding of what affects a lot of our users. For example, here is a blog post detailing how Product Expert feedback about Search Console was used to build the new version of the tool.

Visit the new Product Experts program website to get information on how to become a Product Expert yourself, and come join us on our Webmaster forums; we'd love to hear from you!

Written by Vincent Courson, Search Outreach team […]

  • Help Google Search know the best date for your web page

Sometimes, Google shows dates next to listings in its search results. In this post, we'll answer some commonly asked questions webmasters have about how these dates are determined, and provide some best practices to help improve their accuracy.

How dates are determined

Google shows the date of a page when its automated systems determine that it would be relevant to do so, such as for pages that can be time-sensitive, including news content. Google determines a date using a variety of factors, including but not limited to: any prominent date listed on the page itself, or dates provided by the publisher through structured markup. Google doesn't depend on one single factor, because all of them can be prone to issues. Publishers may not always provide a clear visible date. Sometimes, structured data may be lacking or may not be adjusted to the correct time zone. That's why our systems look at several factors to come up with our best estimate of when a page was published or significantly updated.

How to specify a date on a page

To help Google pick the right date, site owners and publishers should:

Show a clear date: Show a visible date prominently on the page.

Use structured data: Use the datePublished and dateModified schema properties with the correct time zone designator for AMP or non-AMP pages. When using structured data, make sure to use the ISO 8601 format for dates.

Guidelines specific to Google News

Google News requires clearly showing both the date and the time that content was published or updated. Structured data alone is not enough, though it is recommended in addition to a visible date and time. Date and time should be positioned between the headline and the article text. For more guidance, also see our help page about article dates. If an article has been substantially changed, it can make sense to give it a fresh date and time.
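The datePublished and dateModified properties mentioned above can be sketched in JSON-LD roughly like this (the headline and timestamps are made-up placeholders; note the ISO 8601 format with an explicit time zone offset):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example article headline",
  "datePublished": "2019-02-05T08:00:00+09:00",
  "dateModified": "2019-02-05T09:20:00+09:00"
}
</script>
```

Keeping these values consistent with the date and time shown visibly on the page avoids sending mixed signals.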
However, don't artificially freshen a story without adding significant information or some other compelling reason for the freshening. Also, do not create a very slightly updated story from one previously published, then delete the old story and redirect to the new one. That's against our article URLs guidelines.

More best practices for dates on web pages

In addition to the most important requirements listed above, here are additional best practices to help Google determine the best date to show for a web page:

Show when a page has been updated: If you update a page significantly, also update the visible date (and time, if you display one). If desired, you can show two dates: when a page was originally published and when it was updated. Just do so in a way that's visually clear to your readers. If showing both dates, it's also highly recommended to use datePublished and dateModified for AMP or non-AMP pages to make it easier for algorithms to recognize them.

Use the right time zone: If specifying a time, make sure to provide the correct time zone, taking into account daylight saving time as appropriate.

Be consistent in usage: Within a page, make sure to use exactly the same date (and, potentially, time) in structured data as well as in the visible part of the page. Make sure to use the same time zone if you specify one on the page.

Don't use future dates or dates related to what a page is about: Always use a date for when the page itself was published or updated, not a date linked to something like an event that the page is writing about, especially for events or other subjects in the future (you may use Event markup separately, if appropriate).

Follow Google's structured data guidelines: While Google doesn't guarantee that a date (or structured data in general) specified on a page will be used, following our structured data guidelines does help our algorithms make it available in a machine-readable way.
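Time zone offsets, including daylight saving time, are easy to get wrong by hand; one way to produce a correct ISO 8601 timestamp is to let a time zone database do the work. A minimal sketch in Python, using the standard-library zoneinfo module (Python 3.9+) with an arbitrary example zone:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # IANA time zone database, Python 3.9+

zone = ZoneInfo("Europe/Zurich")

# Winter date: standard time (UTC+1) applies.
published = datetime(2019, 3, 20, 9, 30, tzinfo=zone)
print(published.isoformat())  # 2019-03-20T09:30:00+01:00

# Summer date: daylight saving time (UTC+2) is applied automatically.
updated = datetime(2019, 7, 1, 9, 30, tzinfo=zone)
print(updated.isoformat())  # 2019-07-01T09:30:00+02:00
```

The isoformat() output already follows ISO 8601 with a time zone designator, so it can be used directly as a datePublished or dateModified value.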
Troubleshoot by minimizing other dates on the page: If you've followed the best practices above and find that incorrect dates are being selected, consider whether you can remove or minimize other dates that appear on the page, such as those next to related stories.

We hope these guidelines make it easier to specify the right date on your website's pages! For questions or comments on this, or other structured data topics, feel free to drop by our webmaster help forums.

Posted by John Mueller, Developer Advocate, Zurich […]
