
Got an awesome photosphere? Share it on Google Maps!

Google Street View is one of our favorite parts of Google Maps. You may have seen one of the Street View cars driving around taking 360-degree photos of the streets in your town, but did you know it's possible to submit your own photos to Google Maps? There are two ways to do it, one of which is a lot easier than the other.

Submitting a photosphere is the best way to contribute to Google Maps. Photospheres are exactly what they sound like: 360-degree views of an area. You can create one with a dedicated 360-degree camera or with the camera mode built into the Street View app. For some phones, the Street View app is the only way to capture 360-degree photos; others, like the Pixel, have a photosphere mode in the built-in camera app.

You can create a photosphere with the camera app on a Pixel phone, such as the Google Pixel 5a, or with any of the best Android phones using the Google Street View app.
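Under the hood, a photosphere is an ordinary JPEG with Google's Photo Sphere XMP metadata embedded (the GPano namespace, `http://ns.google.com/photos/1.0/panorama/`), which tells viewers to render it as a 360-degree panorama. If you're curious whether a file already on your computer carries that metadata, a minimal sketch like this scans for the marker; the byte-scan heuristic is our own illustration, not how Google validates uploads:

```python
def looks_like_photosphere(head: bytes) -> bool:
    """Heuristic check: photospheres embed Google's GPano XMP metadata
    (projection type, pano dimensions) near the start of the JPEG."""
    gpano_ns = b"http://ns.google.com/photos/1.0/panorama/"
    return gpano_ns in head or b"GPano:" in head

# Usage: scan the first 64 KB of a file (the path is hypothetical).
# with open("my_pano.jpg", "rb") as f:
#     print(looks_like_photosphere(f.read(65536)))
```

A plain photo without that XMP block will show up flat in Google Maps, which is why the import flow only accepts true 360-degree images.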

How to submit your own Street View images to Google Maps (the easy way)

The easiest way (and for some phones, the only way) to submit a photosphere to Google Maps is to submit it as you make it, right where you took it. To start, go to the area you want to capture.

  1. Open the Google Street View app. You'll see small blue circles showing you where photospheres already exist. No sense in duplicating work, after all.
  2. Walk to the place where you want to take the photosphere.
  3. Tap Create.
  4. Tap Photo Sphere.
  5. Your camera will open and display a small field of view and a circle in the middle of the display. Take your first photo.

  6. From there, you'll need to rotate your camera around and take photos of everything around you. To help, your camera viewfinder will place dots where you should aim your camera. Position the circle in the viewfinder around those dots, and the camera will automatically snap the photo. Then just move on to the next dot.

  7. When you're done, the app will take you to your profile. There, you can see the photosphere being prepped for upload. Tap Publish. By default, your phone will wait until you connect to Wi-Fi to upload.

That's it. The photosphere will be geotagged and placed into Google Maps at the location where it was taken.
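Geotagging here simply means the app stamps the image with GPS EXIF tags before upload. As a rough sketch of what that involves: EXIF stores latitude and longitude as degrees/minutes/seconds rational pairs plus a hemisphere letter. The conversion below is illustrative only; actually writing the tags into a file would take a tool like exiftool or an EXIF library:

```python
def to_exif_dms(deg: float, is_latitude: bool = True):
    """Convert decimal degrees to EXIF-style (numerator, denominator)
    rationals for degrees, minutes, seconds, plus a hemisphere ref."""
    if is_latitude:
        ref = "N" if deg >= 0 else "S"
    else:
        ref = "E" if deg >= 0 else "W"
    deg = abs(deg)
    d = int(deg)
    m = int((deg - d) * 60)
    # Seconds kept as a rational with a 1/10000 denominator for precision.
    s = round(((deg - d) * 60 - m) * 60 * 10000)
    return ((d, 1), (m, 1), (s, 10000)), ref

# Example: a latitude in Mountain View (coordinates are illustrative).
dms, ref = to_exif_dms(37.422)  # → ((37, 1), (25, 1), (192000, 10000)), 'N'
```

Because the app records this for you at capture time, the easy way never requires you to place the photo on the map yourself.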

How to submit your own Street View images to Google Maps (the not-so-easy way)

You may already have a photosphere you took in the past that you'd like to submit to Google Street View, but alas, you've already left the location. Alternatively, you may have taken a photosphere somewhere and only later remembered that it might make a good addition. Whatever the case, if you didn't take the photosphere through the Street View app, you can still add it.

  1. Open Google Street View and tap Create.
  2. Tap Import 360 photos.
  3. Navigate to where you have the photosphere saved and tap on it.

  4. You'll be taken to your profile with a preview of the photosphere. Tap Set Location.

  5. This is the not-so-easy part. By default, Google will place the photosphere wherever you are now, and you'll need to drag the little numbered marker to the spot where it was actually taken. If you're nearby, that's not so hard; if you're at home, miles away, it's a bit harder. You can pinch to zoom in and out like on a normal map, which will help you considerably.
  6. Once you have the location, tap the checkmark.
  7. You'll be taken back to your profile. Tap Publish. Again, by default, your phone will wait for Wi-Fi.

Photosphere tips and other notes from our testing

To start, we'd like to acknowledge user ThatAdamGuy in our forums, who submitted some tips for taking great photospheres. When you're shooting one, rather than rotating your body, it's best to pivot the phone itself while keeping it in as close to the same spot as possible. That's not always realistic, since you need to see what you're pointing at, but the less you move your phone through 3D space, the better the stitching will be.

Cloudy days work better for photospheres than sunny ones. That's often said of photography in general: harsh sunlight casts hard shadows across your surroundings, and pointing your phone directly at the sun for one of the photos in the sphere is even rougher. Diffused light is always better.

Google doesn't do a whole lot of QA (if any) before it publishes these photospheres to the public. That's evidenced by the comically terrible photosphere we took as part of this how-to article. Remember above when we talked about not moving your phone through 3D space? Piers over the water are not friendly to that.

Finally, locations of photospheres can be edited. If you place a sphere and later realize it looks a little off, long-tap it in your profile and tap the three dots > Set Location. You can then fine-tune the location as needed.



Source: androidcentral
