Archive for the ‘Displays’ Category

While in Nashville, having just torched my insides with a fiery hot chicken sandwich from Prince’s, I passed by the all-glass Apple store on the corner of 5th & Broadway. It was about 6pm, and observations would indicate the evening had already started in the morning for most people. It was my last night in town, and the Apple store got me thinking about the Vision Pro, my Quest 3, the 250,000 units Apple had reportedly sold to date, and the articles claiming you could get a demo. So I walked in and asked, “Can I get a demo of the Vision Pro?” “Sure! We may have an appointment left,” the friendly associate exclaimed. 7pm was available. I handed them my deets and went off to drop some shopping goodies at my hotel a few blocks away.

The Pre-Demo Setup

Before I even left the store, Apple had texted me a pre-demo survey. It was primarily concerned with whether I wore glasses (I do now), and whether I used them for nearsightedness, farsightedness, or both. I answered the questions and Apple told me I was all set.

Apple’s pre-demo survey recommended I use the Zeiss inserts.

I strolled back into the store close to 7pm. They greeted me quickly with “You’re back!” Well, “I’m a man of my word,” I replied, and followed them to the back of the store for the demo. Sam would be talking me through it. But first they needed my glasses so they could measure them with a fancy, expensive-looking machine – I think I’ve seen the same device at Warby Parker. They needed this to set up the Vision Pro for my exact prescription. Which raised the question: “So what if my prescription changes? Do I need a new set of inserts?” “Yes, you would need to order new inserts,” she explained. I didn’t see her add any inserts, so I was a bit confused by this, but why die on that hill? My guess is the Vision Pro adjusts by itself once provided the prescription details, but who knows. I sure hope I wouldn’t need to buy new adapters, err “inserts,” after already spending thousands of dollars on this thing. And of course – at this time – you can only buy Apple’s special Zeiss inserts, which I’m sure cost a pretty penny.

The lens scanning machine.

After my glasses were measured, I also had to use an iPhone to scan my face. This process wasn’t working well – the app just kept missing the scan – until I moved to a solid-colored wall. Sam was a bit frustrated as well, but she kept her cool.

Now, keep in mind, I’ve owned a Meta Quest 3 for a few months now. I was 100% comparing the setup process of that under-$500 device to the setup process of this $3,500 (base!!) device. With the Quest, after some simple setup I just put the unit on my head and kept my glasses on. I’m curious how much of this pomp and circumstance is actually necessary, or might be removed in a future software update for the Vision Pro.

Seeing all the work and equipment that went into just getting the unit to be “ready for me” helped me understand the price point. The optical equipment, the personnel, the technology for such a customized experience – it all has to come from somewhere. Given Apple would have raked in roughly $875M from 250K units at the $3,500 base price – likely north of $1B once storage upgrades and accessories are counted – I hope they’ve recouped their costs. Now if only there were an iFixit teardown… Oh wait, there is! 😀

Note that the entire demo was done while in a sitting position. I was sitting on a chair next to a wooden table. One other person was experiencing a demo at the same time.

Fit and Finish

Sam showed me what the buttons do – a button for the see-through mode, and a “digital crown” like the one on the Apple Watch. She also showed me the exact way to place the Vision Pro on my head: thumb under the nosepiece, and four fingers on the top. Don’t touch the front! I asked what would happen if I did – would it just look ugly? She said yeah, it wouldn’t look good, but otherwise probably nothing. I followed her advice and put the unit on my head. I used the right-hand dial to tighten the unit as close as possible to my liking. Note that, because of the Zeiss inserts, I did not need my glasses on for the demo.

The “eyes” passthrough wasn’t part of the demo.

Once the Vision Pro was on my noggin’ I realized how heavy it is. I have a Quest 3 at home, and this unit clearly felt heavier. It wasn’t uncomfortable, but it did feel like I had a decent-sized computer on my head, which of course I did. Sam suggested I move the strap around a bit. After some finagling, I figured this was as good as it was going to get. It didn’t feel like it was going to fall off. It just felt front-heavy, like the top-heavy feeling I get when my bourbon-belly body is on rollerblades. I did a search, and the Quest 3 is around 515 grams, while the Vision Pro is roughly 600–650 grams depending on the band and Light Seal (not counting its external battery).

Moving my fingers around, I also found the Digital Crown to be too small. I would use this control later, and I have to tell you: when you can’t see a control, it’s already tiny, and you’re trying to make fine adjustments with it, that’s frustrating. Yes, it’s cool, and it fits with the Apple ecosystem, but it needs to be bigger.

The Digital Crown.

Now, the quality of the materials is top-notch. The strap was incredibly comfortable and disappeared as I used the product. Everything looks clean and precisely engineered. Even the carrying case looks like something high-end from The North Face. The heaviness never went away, however.

The Demo

I should have mentioned earlier that Sam explained she could see everything I was seeing. She had an iPad wirelessly streaming the feed from the Vision Pro. She also had an iPhone that appeared to have the demo script. It was clear Apple wants this demo staged and not free-form. When I would pinch and zoom or move a window before being prompted, Sam would gently verbally nudge me with “Please wait for me.” Sorry, Sam!

First up was the setup mode. The Vision Pro walked me through calibrating my hand tracking and eye tracking. The eye-tracking part was interesting – I had to look at each dot on the screen and then pinch my fingers together to “tap” it. Moving my eyes, not necessarily my head, moves an invisible pointer; the center of my vision – whatever I’m looking at – becomes what’s selected. It was also incredibly clear and vibrant – so whatever the Zeiss inserts and vision calibration did, they did it well.

The experience is also fascinating from a UI and UX perspective. The center stays focused while items in my “peripheral” vision go out of focus when I move my head, coming into slightly better focus when I stop. In practice, this worked very well. However, the selecting and tapping part was not 100%. I’d say 3 out of 10 times – 30% – when I tried tapping something, the Vision Pro wouldn’t register the tap. Perhaps my hand was under the demo table and I didn’t realize it – but moving my hand closer to the device or further in front of me seemed to solve the issue. I also had to ensure I didn’t lose focus on what I wanted to tap, or I would “miss” or tap the wrong item. After some time using this, I’m sure it would become natural. For the most part it was – but it was clear after 30 minutes Apple has some tweaking to do in its UX, and I can see why this is a scripted demo. But still, damn, it’s amazing.

Once setup was complete, the Apple logo appeared, and I was greeted with the Home Screen. Yes, it looks like the typical iOS home screen layout, just floating in front of you with your surroundings semi-transparent in the background. You can tune out your surroundings by rotating the Digital Crown. I was only allowed to use one of the virtual backgrounds – Sam wouldn’t let me play with the others, and she could clearly see via her iPad if I broke the rules. What I did experience, though, was a calming lakeside landscape. It even started raining while I was “there,” which was quite cool and would have been calming had I not been in the middle of an Apple store. The speakers were loud enough for me to hear the raindrops, but I wasn’t there for that experience. Before you ask – no, I didn’t get a chance to set up the see-through mode that shows my eyes. That’s not part of the demo.

There are three basic gestures on the Vision Pro: Tap, Tap and Drag, and Pinch/Pull to Zoom. The first two are single-hand gestures performed with your dominant hand. The latter requires both hands, and gives you the feeling you’re in [insert Sci-Fi movie here] and manipulating virtual screens in the clear space in front of you. Yeah, it’s pretty cool. Another verbal wrist slap from Sam for me getting ahead of the game.

Demo 1 – Photos

The first demo was the classic Apple Photos app. There were many photos to choose from. Some were “flat,” while others had the kind of 3D depth old 3D Android phones were capable of many years ago. Remember the HTC Evo 3D? The flat photo was, well, a photo, and I could zoom in and out as expected. It was perfectly clear, and the colors were sharp and realistic. The 3D photo had true depth, and was shot on an iPhone 15 Pro. Both the 15 Pro and Pro Max support creating 3D and immersive content for the Vision Pro – Apple’s pushing those devices as content creation catalysts, understandably. Because it was a scripted store demo, I didn’t ask about specifics like format support. My understanding is other 3D formats are supported, so you’re not limited to Apple ecosystem solutions.

Demo 2 – Videos

Now for the fun part – video. There was no demo of a flat video here, and that’s fine. Who cares? Every headset does it. You’re not spending $3,500+ for a simple movie theater. There were two demos – one 3D video that wasn’t immersive, meaning it didn’t surround you, and one immersive sports video. The 3D video was cool – a family blowing out the candles on a cake. The frame rate seemed low, maybe 30fps, and reminded me of 3D video from those old 3D Android phones I mentioned. It was neat that it was “large” in front of me, but it wasn’t mind-blowing, since I’d seen the effect before. Now, I’d like to know if the Quest 3 can do the same. Sam did not appreciate that I played the video more than once. To be fair, she had a lot of patience with me – thank you, Sam!

The real treat was the immersive video compilation. It strung together many immersive clips, all narrated by someone telling me how great “living the action” is. One was shot with a 360-degree (I think) camera placed on a soccer goal, and I could see the game and the ball being kicked into the net. Another was a mountain landscape, watching a climber. Another was shot from behind first base during a double play. You get the point – incredible action sequences meant to make you feel like you’re there. And they did. It was exhilarating. I recall Sam explaining it was all 8K video. I asked if the screens themselves were 8K, but she wasn’t sure. The detail was phenomenal. Absolutely stunning.

Is there a new market here?

My first thought was Apple TV Plus – what if they started offering this type of content? Is that where it’s headed? I don’t know if it’s viable. Many of you may remember the many, many, MANY failures of trying to bring 3D into the home. Projectors, TVs, special glasses – and the fact that something like 30% of people can’t watch 3D content without getting nauseated – it never worked. But those efforts also didn’t have the content, beyond more expensive 3D versions of Blu-ray discs. Could Apple stream this type of content? Could they convince people to wear these headsets while watching events such as concerts? I’m not convinced about sports, as I can’t see a bunch of people wearing headsets and drinking beer… Actually, that I’d like to see – if people generally look funny in VR, that would be a hoot. My point is, Apple certainly has the market position and technology to make something happen here. What, exactly, I’m not yet sure… And Meta may be willing to play ball. If the monopoly regulators have their way, it could be a perfect match…

Demo 3 – Web Browsing and Compatible Apps

The last demo showed that I could browse the web (Safari, yay?) and run “compatible apps” from the App Store. Meh. It’s iOS, so no surprises here. Cool, but no compelling killer app. The demo app Sam wanted me to run was a cooking app. I won’t be wearing a $3,500-plus-tax headset near the stove.

The Missing Demo

The Vision Pro content demos were impressive, to be sure. But where was the killer app to sell me on this $3,500 device? Sam kept telling me how this was a “Spatial Computing” device. But never did I see an example of spatial computing. I saw spatial consuming but not spatial creating. I would love to see the results of a survey of the 250,000 purchasers of this product explaining why… and what their income bracket is.

Final Thoughts

I took the Vision Pro off my head and handed it back to Sam. I did this the proper way… thumb under the nosepiece and four fingers on top. I thanked her for the experience and agreed it was quite impressive. I asked how many of these they sold each day. She couldn’t say, other than some people come in and simply buy one outright, no demo needed. It wouldn’t have been fair to ask her why – she’s just selling the unit, and knows fervent Apple fans with an Apple Credit Card are often willing to buy more Apple products (I jest).

But after the demo, I had no incentive to purchase the unit. There was nothing about it, at least during this entertaining 30-minute demo, that gave me a compelling reason to buy. Certainly nothing that made me go, “Gosh, I wish my Quest 3 did that!” I did need to determine whether the Quest 3, at roughly 1/7 the price, can do 3D video (UPDATE: It does!). But the Vision Pro demo was all about content consumption, and the Quest 3 does effectively the same thing in spades. Oh, and I can play VR games made specifically for its platform (noticeably absent from the Apple demo, but understandable given the time constraints).

I also left with a feeling of possibility – for what the Vision Pro represents, and what could finally come from this technology in the content consumption space. And maybe, eventually, in the content creation space, if Apple’s professional applications arm releases whatever they’ve got cooking. Who cares what you call it – spatial computing, VR, or otherwise – if you build something truly compelling.

Either way, the demo was worth it: I got my technologist buzz, got my analyst gears working, and still have $3,500 to spend on something else.

I recently purchased an LG UltraWide 21:9 display. Why an ultra-wide and not 4K? a) I write code for a living and this is a great way to get two windows full size side-by-side without an extra monitor. b) It was only $130 instead of $400 due to a Best Buy sale. Sold!

I get home, connect it to my Surface Book, and nothing works. The screen just blinks on and off, on and off, blinkety blink, blinkety blink. No bueno. Changing the cable made the blinkety blink go away, but the screen still wouldn’t light up – the display settings panel just kept suggesting I try different settings.

My friend Shane recommended I get an Active MiniDP to HDMI adapter. So I bought one on Amazon. Still, I didn’t want to wait… that’s 2 days with Prime shipping, and not fast enough.

So I looked into the MiniDP adapter I was using and found out it only supports up to 1080p! Maybe it’s an old DP 1.1-era (or passive) adapter – whatever it is, it couldn’t drive a 2560×1440 (or thereabouts) display.
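For the curious, here’s the back-of-the-envelope math on why an older adapter tops out around 1080p. These numbers assume standard CVT reduced-blanking timings and the 165 MHz TMDS clock limit of older Type 1 passive (dual-mode) DisplayPort-to-HDMI adapters – I never confirmed exactly which kind my original adapter was, so treat this as a plausible explanation rather than a diagnosis:

\[
f_{\text{pixel}} \approx \underbrace{(2560 + 160)}_{\text{H total}} \times \underbrace{(1440 + 41)}_{\text{V total}} \times 60\,\text{Hz} \approx 242\,\text{MHz}
\]

A 165 MHz TMDS clock only gets you to roughly 1920×1200 at 60 Hz (about 154 MHz of pixel clock), which is why anything above 1080p failed. A Type 2 or active adapter – the kind marketed as “4K / DisplayPort 1.2” – can carry the ~242 MHz a 2560×1440 panel needs.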

Fry’s had the answer – I made sure I found a MiniDP to HDMI adapter that clearly stated it supports 4K and MiniDP 1.2. The particular product I purchased was the Cirago Mini DisplayPort to HDMI Display Adapter.

I got home and the adapter worked flawlessly.

I hope that helps anyone having a similar issue!

 

Below are my notes from Day 1 of the CEATEC show in Makuhari, Japan.


Sony Info-Eye + Sony Social Live

Sony showcased two unique social apps, Info Eye and Social Live, part of their Smart Social Camera initiative.


Info Eye takes an image and analyzes it for different types of information, surfacing that information in different views. For example, take a photo of the Eiffel Tower and you are presented with different "views" of the data. The first view may be related photos of the French attraction, such as a view from the top, or the Eiffel Tower Restaurant. Change views and you’re presented with a map of Paris. Continue to the next view and see your friends’ comments on social networks about the attraction. It certainly is an innovative approach to automatically get benefits from simple photo taking – photos you normally wouldn’t look at again anyway.


Social Live is a live video feed, streamed from your phone to various social services. While the example of a live marriage proposal wasn’t so realistic, Social Live still has great consumer applications. For example, post a Social Live link on Facebook and your friends could view your video feed while you tour the Champs-Élysées in Paris, without your needing to initiate a Skype call. It’s like having a live broadcast stream ready to go at any time.

3D 4K Everywhere!

3D didn’t entice the world – again – so why not reuse all that marketing material, swapping “4K” in for “3D”? No, it’s not that bad, and 4K is beautiful, but it’s just too early and too expensive, as is almost every evolutionary technology like this. Just for fun I made a collage of the various offerings. Component innovation is once again creating products at a pace greater than consumers’ willingness to adopt.


Tizen IVI Solutions at Intel

Intel had a sizeable display of Tizen OS-based in-vehicle infotainment (IVI) solutions at its booth. Apparently Intel had 800 developers working on Tizen while partnered with Nokia on the OS formerly known as MeeGo. The most interesting Tizen demonstration was Obigo’s HTML5-based IVI solution. On a related note, Samsung is apparently folding its Bada OS into Tizen. It will be interesting to see whether any of this makes a difference in the global mobile OS race, still dominated by Android, then iOS, then Windows Phone.


Obigo’s HTML5-based In-Vehicle-Infotainment Solution

Obigo’s solution is to automotive application development what PhoneGap is to standard mobile application development. Developers build apps – called widgets in Obigo’s system – using HTML5 + JavaScript, accessing vehicle data and services via an abstraction layer provided by the Obigo engine. Nothing appears to prevent Obigo from bringing this solution to Android, so look for that possibility on the various Android vehicle head units coming to market. Hyundai and Toyota will be the first integrators of the system.
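To make the PhoneGap analogy concrete, here’s a rough sketch of what that widget pattern tends to look like: HTML5/JavaScript UI code calling into a vehicle-data bridge injected by the native engine. Everything below – the vehicleBridge object, its method names, the signal name – is hypothetical and purely illustrative; Obigo’s actual APIs will differ.

// Hypothetical IVI widget sketch (TypeScript). The vehicleBridge object stands in
// for whatever abstraction layer the head-unit engine exposes to the widget's
// JavaScript context - names and shapes here are made up for illustration.
interface VehicleBridge {
  // Subscribe to a named vehicle signal (speed, fuel level, etc.) at a polling interval.
  subscribe(signal: string, intervalMs: number, callback: (value: number) => void): void;
}

declare const vehicleBridge: VehicleBridge; // assumed to be injected by the engine

// A tiny "widget": render the current speed into an element the widget's HTML defines.
function startSpeedWidget(): void {
  const el = document.getElementById("speed");
  if (el === null) return;
  vehicleBridge.subscribe("vehicle.speed", 500, (kmh) => {
    el.textContent = `${kmh.toFixed(0)} km/h`;
  });
}

startSpeedWidget();

The appeal is the same as PhoneGap’s: the UI is just web code, and only the thin bridge layer needs to know anything about the vehicle.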


Apparently Japanese Car Insurance is Very Expensive

Another solution shown at the Intel Tizen display was a driving-habits monitor capable of emailing that information to your insurance company, the goal being lower insurance rates. The implementation was hokey at best, but at least I’ve learned insurance is expensive here as well.

Fujitsu Elderly Care Cloud

In an effort to keep Japan’s increasingly elderly population in touch with their families, Fujitsu has created a "Senior Cloud." The benefit to seniors will apparently be video and photo communication and sharing with their families, alongside services for sharing healthcare details. I couldn’t get a demo, but it sounds like a good idea. For the next 10-20 years, anyway – by then, the "elderly" will be the people who already know how to do these things.


ModCrew iPhone Waterproofing Coat

ModCrew displayed a nano-coating solution for iPhones (only), rendering your fruit phone washable.


Omron Basal Thermometer with DoCoMo Health Smartphone App

Omron has a unique line of basal thermometers, with pleasant shapes and colors, targeted (obviously) towards women. The devices, among other Omron health devices, can all transmit their data via NFC to phones and tablets. Using an app from NTT DoCoMo, the health data can be consolidated and analyzed, and health advice can be provided.


All health components gather data to recommend healthy choices.


Huawei Phone with Panic Alarm

Chinese consumer and mobile electronics provider Huawei showcased their HW-01D feature phone with a built-in panic alarm. Targeted towards women, children, and the elderly, the device has a pull tab that sets off a loud, yet oddly pleasant, siren to scare away would-be perpetrators.


Fujitsu Finger Link

Fujitsu’s Finger Link solution uses a top-mounted camera to convert physical objects into virtual ones, enabling you to organize and relate those items for later manipulation. For example, put three Post-it notes down and they are converted to digital representations, automatically recognized as separate objects. Tap one item, then drag a line to the other items you want to associate with it. Tap a button on the projected interface and now they’re related, movable, sharable, and more.


Fujitsu Sleepiness Detection Sensor

A hot topic among the vehicles displayed at CEATEC this year was detecting distracted driving. Fujitsu’s component detects eyes moving away from the road – a downward or upward motion possibly signifying the driver is drowsy. The component is aimed at automotive integrators.


Fujitsu Big Data + Open Data LOD Utilization Platform

Fujitsu showcased an open LOD (Linked Open Data) utilization platform for quickly and easily mining and analyzing data from many Open Data sources at once, visually. The back end uses the SPARQL query language.
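Since the back end speaks SPARQL, querying a platform like this from the outside is straightforward in principle: SPARQL 1.1 endpoints accept a query over HTTP and return JSON bindings. The endpoint URL and query below are placeholders (Fujitsu didn’t share specifics); the sketch just shows the standard protocol shape.

// Minimal sketch of querying a SPARQL 1.1 endpoint over HTTP (TypeScript).
// The endpoint URL is a placeholder - any Linked Open Data endpoint works the same way.
const ENDPOINT = "https://example.org/sparql";

const query = `
  SELECT ?s ?label WHERE {
    ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label .
  } LIMIT 10
`;

async function runQuery(): Promise<void> {
  const res = await fetch(`${ENDPOINT}?query=${encodeURIComponent(query)}`, {
    headers: { Accept: "application/sparql-results+json" },
  });
  const json = await res.json();
  // Results arrive as rows of variable bindings.
  for (const row of json.results.bindings) {
    console.log(row.s.value, row.label.value);
  }
}

runQuery().catch(console.error);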


Mitsubishi 4K LaserVue

Mitsubishi showcased a prototype 4K red-laser + LED backlit display, enabling a beautiful, beyond-photorealistic picture. Standing in front of the reference unit, I actually felt like I was looking through a window – the colors were amazingly vivid and lifelike.


Mitsubishi elevator skyscraper sway detection system

Mitsubishi also showcased a solution for preventing elevator stalls in swaying skyscrapers. Their sensor detects the sway and moves the elevator car to a non-swaying or less-swaying floor to prevent service outages, keeping the elevators running as efficiently as possible – and giving you one less excuse to miss that meeting.


Mitsubishi 100Gbps optical transmission technology

Mitsubishi showcased a 100 gigabit/second inter-city optical interconnect solution, with a range up to 9000 kilometers.


Mitsubishi Vector Graphics Accelerating GPU

Who says you need multi-core ARM processors running over 1 GHz plus powerful GPUs for beautiful embedded device interfaces? Mitsubishi sure doesn’t. They showcased a GPU running at a scant 96 MHz, accelerating vector graphics at up to 60 frames per second. Remarkably responsive interfaces for elevators and boat tachometers were on display. The target is rich user interfaces with incredibly low power consumption.


Mitsubishi Rear Projection Display for Automotive

It’s no surprise Mitsubishi is proposing rear-projection solutions for automotive – RP is one of the company’s strengths. What they propose is curved surfaces to provide an interface that matches the interior of the vehicle. 3D-like interfaces are also possible.


Sharp Frameless TV Concept

A display with no bezel? Sharp’s frameless concept showcases how beautiful such a solution would be.


Sharp Mirror Type Display

Also on display (ahem) was the Mirror Type Display, with a display built into a mirror. Have I said display enough times?

Pioneer Wireless Blu-ray Drive

That shiny new ultrabook is pretty svelte, isn’t it? What’s that? You want to watch a Blu-ray? That’s fine – just use Pioneer’s BDR-WFS05J to wirelessly connect to a Blu-ray drive across the room – as long as the drive is in its dock – and stream the data over 802.11n. The unit also supports USB 2.0 and 3.0. It ships at the end of September.


Toyota Smart Home HEMS Using Kinect

Toyota showcased a smart home energy management system (HEMS) using Kinect to interact with various residents.

Toyota Concept Vehicles

I don’t know much about the one-person electric riders Toyota had on display, but they looked cool.


Clarion Smart Access + EcoAccel

Determining whether you’re driving green, or "Eco" as they say in Japan, can be difficult. Clarion’s EcoAccel app, which runs on their Android-powered head unit, reads OBD-II sensor data to rate your eco-driving habits. It’s an entertaining way to enhance the eco-friendliness of your driving routine. The representative said there are no current plans to bring this product Stateside, but I’m hoping they change their mind. After all, OBD-II data is pretty easy to read, even if it’s not entirely standardized.
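To show how approachable OBD-II data is: with a cheap ELM327-style adapter you send a mode/PID request like 010D (mode 01, PID 0D = vehicle speed) and parse the hex bytes that come back. The sketch below only does the parsing – the serial/Bluetooth plumbing to the adapter is left out – and the response strings are the common ELM327 text form, so take it as illustrative rather than a complete implementation.

// Parse a couple of common OBD-II mode 01 responses in the ASCII form returned
// by ELM327-style adapters (e.g. "41 0D 3C" in reply to a "010D" speed request).

// PID 0x0D: vehicle speed in km/h (single data byte A).
function parseSpeed(response: string): number {
  const bytes = response.trim().split(/\s+/).map((h) => parseInt(h, 16));
  if (bytes[0] !== 0x41 || bytes[1] !== 0x0d) throw new Error("unexpected response");
  return bytes[2];
}

// PID 0x0C: engine RPM = ((A * 256) + B) / 4 (two data bytes).
function parseRpm(response: string): number {
  const bytes = response.trim().split(/\s+/).map((h) => parseInt(h, 16));
  if (bytes[0] !== 0x41 || bytes[1] !== 0x0c) throw new Error("unexpected response");
  return (bytes[2] * 256 + bytes[3]) / 4;
}

console.log(parseSpeed("41 0D 3C"));  // 60 km/h
console.log(parseRpm("41 0C 1A F8")); // 1726 rpm

The "not entirely standardized" part shows up once you go beyond the basic PIDs – manufacturer-specific modes and IDs vary – but the core mode 01 data above is the same across vehicles.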


Mazda Heads Up Cockpit

While the HUD component is nothing to write home about, Mazda’s approach of keeping everything at eye level, while repositioning the shift knob so it’s easy to manipulate, was a welcome mix of safe driving and ergonomics. Better yet, they will be shipping this in their Axela vehicles, meaning less expensive vehicles will be getting technology that deters distracted driving. They call this the Heads Up Cockpit with a Concentration Center Display.


Mazda Connect System

Mazda also showcased the Mazda Connect system, which lets car communication and software components be "easily" upgraded as new features become available. Whether this will be an insanely expensive solution, akin to Samsung’s upgradeable TV approach, remains to be seen.

It’s fascinating to see how some of the most innovative products are coming from what used to be one of the least innovative industries: automotive.