
Google ‘working on a fix’ for fatal camera error affecting Pixel 2 phones

Ever since Google’s Pixel 2 and Pixel 2 XL smartphones arrived last fall, they have been affected by a fatal error that causes the camera app to crash when opening it or trying to snap photos. Not everyone has been affected by this issue, but Google hadn’t really addressed it until an official Google Twitter account responded to one frustrated customer earlier this week. The team is apparently “working on a fix” for the error now.

On Sunday, a Pixel 2 XL user expressed her frustration with the fatal camera error on Twitter, to which Google swiftly responded. The Made by Google Twitter account suggested she clear the Camera app’s cache, at which point she explained that she had already cleared the cache and performed a full factory reset to no avail.

Minutes later, the Made by Google Twitter account responded once again, now suggesting that she put the phone in airplane mode and try to take a picture. Putting aside the absurdity of this solution, it didn’t work either. She received one final note when she explained that nothing she had tried was helping, revealing that the Pixel team is aware of the “fatal camera error” and is currently working on a solution:

https://twitter.com/madebygoogle/status/1015993578185773056

Reports of users experiencing this error started popping up within weeks of the Pixel 2’s launch, which makes it all the more baffling that it took Google this long to issue a response. And keep in mind that this is just a tweet spotted by a Redditor, not an official statement. Now the question is whether or not Google will issue a fix before the Pixel 3 is unveiled this fall. Will Google release a new phone before it even fixes its current flagship?

Android 9.0 inches closer to launch as Android P Beta 3 rolls out

Google on Monday afternoon dropped the third public beta of Android P, which is supposed to be a close-to-final version of Android 9.0 that will be released at some point this summer. If you’re already on Android P, either on a Pixel device or on a handset from one of the other OEMs that partnered with Google for the beta, you can seamlessly upgrade to the new beta. It’s just like a regular software update.

If you want to give Android P a try for the first time now that it’s more stable, you need to enroll in the beta program, which is available at this link. In a post on Google’s Android blog, Google VP of engineering Dave Burke explains that the new beta “takes us very close to what you’ll see in the final version of Android P,” which is coming this summer.

Now that the developer APIs were finalized in the previous update, Android P Beta 3 includes bug fixes and optimizations “for stability and polish,” as well as the expected July 2018 security updates.

As Google says, Android P Beta 3 isn’t big on new user features, but there are some changes that Android-savvy users will notice right away. Droid-Life has a quick video of everything that’s new in the latest version of Android P:

https://www.youtube.com/watch?v=Fpn3KW_D1SU

Burke also said that Google will host a Reddit Ask-Me-Anything (AMA) session on July 19th to answer more questions about Android P. The final Android P version should be rolled out at some point in August, which is also when we’ll learn what dessert Google has chosen for Android P. Then Android users will just have to wait anywhere from a month to forever for their smartphone vendor and carrier to release the update for their particular phone model.

Best Buy deal takes $400 off a Pixel 2 XL – and no, you don’t have to buy two phones

The Pixel 3 and Pixel 3 XL are likely just months away from release, which means that carriers and stores are starting to hustle unsold Pixel 2 inventory out the door. One of the most popular types of carrier deals these days is the buy-one-get-one, which generally makes a second device free (in the form of 24 bill credits). They’re great deals in their own right, but if you don’t need two phones or an extra line of service, they’re not worth it.

Good deals on single devices are much rarer, which is why this Pixel 2 XL deal that’s being run by Verizon and Best Buy is definitely worth a look. You get a total of $400 off the RRP of a Pixel 2 XL, bringing the price down to $449.84 from $849.84. You get $200 off straight away, and the other $200 is applied as bill credits over 24 months.

There’s also a smaller deal running on the appropriately smaller Pixel 2 which takes a total of $200 off, $100 from Best Buy and the other $100 in 24 months of bill credits. Both deals require eligible wireless service on Verizon, and if you cancel wireless service early, you pay the remaining balance on your device. Full details are below:

Google Pixel 2 XL (SKU 6099989). Installment billing plan: Verizon Device Payment. Total Sale Price $449.84. Total Price $849.84. Total savings of $400 is comprised of $200 bill credit via Verizon and $200 instant savings via Best Buy. Must be activated with eligible wireless service (including voice and data). If wireless service is canceled, the installment agreement balance will be due. Taxes/surcharges due at time of purchase. No rainchecks. Savings are deducted off the full total price of the device and are reflected in reduced monthly payments over the life of the device; the bill credit savings are applied directly to the customer’s Verizon account and will not occur at Best Buy Point of Sale. Verizon Bill credit will be applied within 1–2 billing cycles. Offer may not be combinable with other credits, discounts and offers.

Apple might steal HTC’s signature squeeze control for the iPhone, but it will actually be good

HTC called it Edge Sense when the U11 came out last year, and then Google borrowed the function for the Pixel 2 phones, changing its name to Active Edge. As the name suggests, the feature lets you squeeze the frame to trigger a response, but it’s more of a gimmick on the Pixel given that there’s not much you can do with it aside from calling up the Google Assistant or muting the phone. But Apple is working on a similar feature that might actually be useful.

Yes, the iPhone might copy the Pixel for a change, but don’t be fooled, this isn’t anything like Google’s way of copying the iPhone.

First discovered by Digital Trends, US patent application no. 20180164166 is titled Mobile Electronic Device with Squeeze Detection, which perfectly explains what this Apple innovation is all about.

The patent was published on Thursday by the USPTO, but it happens to be a continuation of Apple patents from 2016 and 2013, both called Force Sensing Compliant Enclosure — like I said, don’t be fooled, this isn’t Apple stealing HTC or Google’s edges.

And while the Apple “Edge” may very well call upon Siri in an actual implementation, it’ll also be a lot more exciting than Google’s. If you know Apple, then you know the company’s ultimate iPhone design is a device that’s all-screen, without any physical buttons on it. Other Apple patents also detail inventions like wraparound displays with screens that would extend around the edges. And Apple is also working on foldable smartphones of its own.

This patent seems to be laying the groundwork for this all-screen iPhone of the future, as it’ll help Apple remove physical buttons and improve the overall design.

The patent describes technology that would allow Apple to include sensors (103) in the edges of the phone, which would detect, much like 3D Touch, the force with which you’re pressing against the handset’s external walls. And the iPhone may trigger different actions based on the location of the sensors:

The electronic device may include one or more processing units that receive and interpret data regarding the strain sensed by the sensor to determine one or more user inputs that correspond to the force applied to the deformable housing wall. In some cases the processing unit may analyze the data to determine an amount of the applied force and/or a location on the force compliant enclosure and/or electronic device where the force was applied. In various cases, the processing unit may determine the location where the force was applied and compare the determined location to a previously determined location where a previous force was applied to determine a movement between the two locations.
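The behavior the patent text describes (determine the amount and location of an applied force, then compare successive locations to detect movement along the edge) can be illustrated with a toy sketch. To be clear, this is a hypothetical interpreter, not Apple's implementation; the thresholds, the `StrainSample` type, and the gesture names are all made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class StrainSample:
    force: float      # estimated applied force (arbitrary units)
    location: float   # position along the edge, 0.0 (bottom) to 1.0 (top)

def interpret(samples, squeeze_threshold=5.0, slide_threshold=0.15):
    """Toy interpreter for edge strain data: registers a 'squeeze' when a
    new press exceeds the force threshold, and a 'slide' when the press
    location moves far enough between consecutive readings."""
    events = []
    prev = None
    for s in samples:
        if s.force < squeeze_threshold:
            prev = None          # press released; next strong reading is a new squeeze
            continue
        if prev is None:
            events.append("squeeze")
        elif abs(s.location - prev.location) > slide_threshold:
            events.append("slide")
        prev = s
    return events

print(interpret([StrainSample(6.0, 0.2),
                 StrainSample(6.5, 0.5),
                 StrainSample(1.0, 0.5)]))
# → ['squeeze', 'slide']
```

A real device would of course filter noisy sensor data and fuse readings from multiple sensors, but the core idea is the same: force plus location over time yields gestures.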

But wait, it gets even better than the Pixel. Say you’ve got an iPhone with a wraparound display, or an iPhone with a foldable screen. Its edges may contain such sensors. That’s because Apple’s tech could work both with housings made of metal and with walls made of hard plastic:

Further, the deformable housing wall may be composed of a first material at portions other than strain concentration portion and a second material at the strain concentration portion. In some cases, such a first portion may be less deformable than the second material, such as where the first portion is a rigid material such as metal or hard plastic and the second portion is a soft material such as an elastomer or other material that is softer than the first material, in order to further maximize the strain at the strain concentrating portion. Alternatively, the second material may be more rigid than the first material in order to strengthen thinner strain concentrating portions and/or prevent stress fatigue.

That last sentence is also interesting from a different point of view. Remember Apple’s #Bendgate issue? Back then, we learned that the iPhone 6 Plus was more likely to bend right near the volume rockers. Button cut-outs in the aluminum frame were partly to blame.

And Apple may deploy these sensors in wall sections where you’d expect physical buttons — yes, Apple uses the word notch, but it means something entirely different.

In some implementations, the force sensing compliant enclosure may include multiple strain concentrating portions that may form pockets or notches in one or more deformable housing walls of the force sensing compliant enclosure that may each include one or more sensors. However, in other implementations the strain concentrating portion may comprise a groove that runs along an inner surface of one or more deformable housing walls, such as across an internal perimeter of the force sensing compliant enclosure.

A notable side effect of a button-less design, not mentioned in the patent, would be improved water resistance for the iPhone.

Like any Apple invention found on the USPTO’s site, there’s no telling when we’ll see it in commercial products, whether in the iPhone or in other Apple devices. If, however, the iPhone gets an active edge of its own, expect others to copy it, Google included.

The Pixel 2 tech used for single-lens portrait mode photos is now open to anyone

When Google unveiled the Pixel 2 series last fall, the company made a big deal about its camera tech and stressed the fact that it doesn’t need two cameras to offer users a portrait mode similar to what Apple had introduced a year earlier.

Portrait mode on iPhone Plus and iPhone X models requires the use of two cameras. Google’s Pixel 2 phones, meanwhile, do it with a single lens. That’s because Google uses AI software to generate the portrait effect, even though the end result might not match Apple’s.

Google is now ready to make part of that technology available to any phone maker out there. Google isn’t open-sourcing the entire Pixel 2 camera tech or the actual portrait mode on these phones, just the AI tool that makes it possible.

The company announced the move in a blog post earlier this week, but nobody will blame you if you missed it. It’s published on the research blog under an unappealing title, Semantic Image Segmentation with DeepLab in TensorFlow (unless you’re into AI, in which case you’ve probably seen it already).

What Google is open-sourcing is DeepLab-v3+, an image segmentation tool built using neural networks. Google has been employing machine learning for a few years now to improve the quality and smarts of its Camera and Google Photos apps.

The image segmentation part is what’s interesting here, and it’s what allows the Pixel 2 to take single-camera portrait mode shots. The AI recognizes objects in images in real time, which makes the depth-of-field effect in pictures possible:

Semantic image segmentation, the task of assigning a semantic label, such as “road”, “sky”, “person”, “dog”, to every pixel in an image enables numerous new applications, such as the synthetic shallow depth-of-field effect shipped in the portrait mode of the Pixel 2 and Pixel 2 XL smartphones and mobile real-time video segmentation. Assigning these semantic labels requires pinpointing the outline of objects, and thus imposes much stricter localization accuracy requirements than other visual entity recognition tasks such as image-level classification or bounding box-level detection.

App developers and device makers can now use these Google innovations to create their own Pixel 2-like camera bokeh effects in devices that lack a secondary camera. That doesn’t mean other Android devices are about to replicate the entire Pixel 2 experience or the Pixel 2’s portrait mode, as Google is far from actually sharing those secrets. But this semantic image segmentation trick is a step in that direction.
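The pipeline Google’s quote describes (a per-pixel segmentation mask driving a synthetic shallow depth-of-field effect) can be illustrated with a toy sketch. This is not Google’s code: it stands in a hard-coded square mask where a real model like DeepLab-v3+ would label “person” pixels, and it uses a crude box blur instead of a lens-accurate one, purely to show how a mask turns into a bokeh composite.

```python
import numpy as np

def box_blur(img, radius=4):
    """Crude box blur: average each pixel over its (2r+1)x(2r+1)
    neighborhood (wrapping at the edges, fine for a toy demo)."""
    acc = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (2 * radius + 1) ** 2

def synthetic_bokeh(photo, person_mask):
    """Composite the sharp 'person' pixels over a blurred copy of the
    frame, mimicking the depth-of-field effect a segmentation mask enables."""
    blurred = box_blur(photo)
    # Keep original pixels where the mask is True, blurred ones elsewhere.
    return np.where(person_mask[..., None], photo, blurred)

# Stand-in data: a random 'photo' and a square mask where a real
# segmentation model would have labeled pixels as 'person'.
rng = np.random.default_rng(0)
photo = rng.random((64, 64, 3))
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True

result = synthetic_bokeh(photo, mask)
```

The subject stays pixel-identical to the original while the background is smeared, which is exactly why the quote stresses that segmentation needs much tighter localization than image-level classification: a sloppy outline would blur the subject’s edges too.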