
Augmented and virtual reality to disrupt mobile by 2020

"Cloud computing," as it's called now, is little more than the revival of the pre-1980s mainframe mentality...and it's a lot more financially attractive to monetize data stored on a corporate server than sell people solutions for setting it up at home.
On the money as far as I can see. In 20 years computers won't even function without Internet connectivity. Instead of buying software you then own, you'll have to subscribe to it for a monthly rate. "Monetizing" is indeed the term: While the geek priesthood is good at what they do, they don't have your best interests at heart. They want you in a dependent position.
 
In 20 years computers won't even function without Internet connectivity. Instead of buying software you then own, you'll have to subscribe to it for a monthly rate.

What do you mean "in 20 years"?
Hello Adobe, hello Microsoft Office.
 
Hello Adobe, hello Microsoft Office.
Which is the start of the trend. But I can see a day when you won't be able to type a Word document on your own. Microsoft won't let Word off its servers at any time, and when you want to use Word, the server will download only the needed code and then erase it as soon as each command or set of commands is completed. I assume bandwidth constraints have limited Microsoft's ability to implement such a regime up to now; these are rapidly disappearing.
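To make the idea concrete, here's a toy sketch in Python of what that kind of per-command code delivery could look like. It's purely hypothetical (the in-memory dict stands in for a vendor's server; nothing here reflects how Word actually works): the client fetches only the code for one command, runs it, and immediately discards it.

```python
# Toy illustration of "the server hands you only the code for one command,
# then it's gone." Purely hypothetical; a dict stands in for the vendor's
# server, where a real client would fetch over HTTPS.

FAKE_SERVER = {
    "word_count": "result = len(document.split())",
    "to_upper":   "result = document.upper()",
}

def run_remote_command(command_name: str, document: str):
    """Fetch the code for a single command, execute it, then discard it."""
    code = FAKE_SERVER[command_name]   # "download only the needed code"
    scope = {"document": document}
    exec(code, {}, scope)              # run it locally...
    del code                           # ...and erase it as soon as it's done
    return scope["result"]

print(run_remote_command("word_count", "the quick brown fox"))  # 4
print(run_remote_command("to_upper", "hello"))                  # HELLO
```

Under that kind of regime the client machine never holds the full application, which is exactly the dependent position being described.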
 
Hello Adobe, hello Microsoft Office.
Which is the start of the trend. But I can see a day when you won't be able to type a Word document on your own. Microsoft won't let Word off its servers at any time, and when you want to use Word, the server will download only the needed code and then erase it as soon as each command or set of commands is completed. I assume bandwidth constraints have limited Microsoft's ability to implement such a regime up to now; these are rapidly disappearing.

There's always Linux!
 
"Cloud computing," as it's called now, is little more than the revival of the pre-1980s mainframe mentality...and it's a lot more financially attractive to monetize data stored on a corporate server than sell people solutions for setting it up at home.
On the money as far as I can see. In 20 years computers won't even function without Internet connectivity. Instead of buying software you then own, you'll have to subscribe to it for a monthly rate. "Monetizing" is indeed the term: While the geek priesthood is good at what they do, they don't have your best interests at heart. They want you in a dependent position.
Ever picked up a Google Chromebook? It requires a connection for pretty much any application.
 
http://techcrunch.com/2015/04/06/augmented-and-virtual-reality-to-hit-150-billion-by-2020/
I'm sure we'll hear from the same people who said the cloud wouldn't take off and that 4K was going nowhere. :lol:

Cloud didn't take off. It merely landed in our collective laps when developers decided to dump it on us and then charge us a fee for using it. Most people I know only use cloud services as a convenient backup (an alternative to external hard drives), primarily because they've learned to expect their network connection to fail at the most inconvenient times, and nothing sucks harder than not being able to access your documents just because you can't find a hotspot that doesn't run like molasses.

There's also the fact that I originally started using Dropbox PURELY so I wouldn't have to keep emailing shit to myself from my lab station at school. In that sense, we've ALWAYS had cloud storage, we just didn't always call it that.

I'd lump VR/AR in the same category as 3D and video phones. The technology has been around for a long time, and it's an exciting novelty to play with. But when the novelty wears off, very few actually adopt the new technology for everyday use.
It's also worth remembering that the people that need to adopt the technology in order for it to really become common are the "selfie/snapchat/facebook" crowd whose technology usage is primarily driven by narcissism and the need to draw attention to themselves. They want technology that will help display themselves to the world, not that will display the world to them. VR has much the same problem unless developers start finding ways of producing hyper-realistic avatars based on the person using the VR set.

As for 4K, more is not always better. Digital signal processing and transmission did more to improve video than resolution. That is, even contemporary video that is below "Standard Definition" looks vastly better than pre-digital systems.
There's also the fact that 4k is tremendous overkill for most of the programming people watch on TV in the first place. It's almost understandable if you're watching, say, Prometheus on a 90 inch TV in your home theatre system. But then your kids turn on Cartoon Network in the middle of the afternoon and you notice that Adventure Time in 4k isn't all that impressive after all.

I'm thinking it's a lot like wine snobs: Most of the people who say that they can tell the difference are actually lying to themselves.
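On the "overkill" point, a quick back-of-the-envelope check makes it concrete. The screen size, resolutions and one-arcminute visual-acuity figure below are just my illustrative assumptions:

```python
import math

# At what distance can a viewer with ~1 arcminute of acuity still resolve
# individual pixels on a 55-inch 16:9 screen? (Illustrative assumptions.)

DIAGONAL_IN = 55
WIDTH_IN = DIAGONAL_IN * 16 / math.hypot(16, 9)   # screen width in inches
ACUITY_RAD = math.radians(1 / 60)                 # one arcminute, in radians

for name, h_pixels in [("1080p", 1920), ("4K", 3840)]:
    pixel_in = WIDTH_IN / h_pixels                # pixel pitch in inches
    # Beyond this distance, this format's pixel grid blurs together.
    max_dist_ft = (pixel_in / ACUITY_RAD) / 12
    print(f"{name}: pixel grid indistinguishable beyond ~{max_dist_ft:.1f} ft")
```

That works out to roughly 7 feet for 1080p and 3.6 feet for 4K on a 55-inch set; past the 1080p figure, the extra resolution is essentially invisible from the couch.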


Virtual reality is for folks who can't handle actual reality. ;)

Who is the Internet for, then?
People who like porn.
 
Hello Adobe, hello Microsoft Office.
Which is the start of the trend. But I can see a day when you won't be able to type a Word document on your own. Microsoft won't let Word off its servers at any time, and when you want to use Word, the server will download only the needed code and then erase it as soon as each command or set of commands is completed. I assume bandwidth constraints have limited Microsoft's ability to implement such a regime up to now; these are rapidly disappearing.

Which is interesting, because when we replaced the old office computer in here it took us 7 hours just to download and install Microsoft Office because their damn website kept glitching. We finally gave up trying to download it and wound up going out to buy the software from Best Buy... only to discover that you don't buy software anymore, you buy a download code and still have to use their glitchy-ass website and hope for the best.

This, plus one truly heartbreaking day of "server error" that inexplicably lost all my gamesaves on the Xbox cloud drive, explains why I and most people don't trust Microsoft with anything more important than "emergency backups."

 
Also, we have multiple-terabyte hard drives to fill, and if we use the cloud they never will be. I have a terabyte drive that replaced two 300-gigabyte drives, and I still haven't filled 300 gigabytes of space on the new one.
 
I still don't understand putting major software packages that many large companies depend on in the cloud. Many of them need to use the software in classified or completely closed off networks, and *can't* connect to the internet. Not to mention the many users that won't be connected 24/7.
 
I'm sure we'll hear from the same people who said the cloud wouldn't take off and that 4K was going nowhere. :lol:

When you say "cloud" do you mean the Internet in general, or specifically storing and backup of personal files on "cloud" servers? Such storage can chew up bandwidth very rapidly, especially if one has a data capped plan. Also, lots of people simply do not trust cloud services with potentially sensitive material.

But it's high tech! That's reason enough to use it! Singularity! :scream:

I'd lump VR/AR in the same category as 3D and video phones. The technology has been around for a long time, and it's an exciting novelty to play with. But when the novelty wears off, very few actually adopt the new technology for everyday use.

As for 4K, more is not always better. Digital signal processing and transmission did more to improve video than resolution. That is, even contemporary video that is below "Standard Definition" looks vastly better than pre-digital systems.

In my opinion 4K/UHDTV is overkill for the living room. I imagine it will become commonplace one day, by default, but right now the increase in resolution is not so striking that it will leave 2K/HDTV in the dust. I'm talking from experience—seen it, worked with it. (VR/AR, too.)
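To put the data-cap worry above in rough numbers: the cap, backup size and upload speed here are purely illustrative assumptions, not anyone's actual plan.

```python
# Rough numbers for the data-cap point above. The cap, backup size and
# upload speed are illustrative assumptions, not any real plan's terms.

cap_gb = 1024          # a common 1 TB monthly data cap
backup_gb = 300        # e.g. an initial photo/video library upload
upload_mbps = 10       # a typical residential upload speed

megabits = backup_gb * 8 * 1000                 # GB -> megabits
hours = megabits / upload_mbps / 3600           # at upload_mbps, in hours
print(f"Initial {backup_gb} GB upload: ~{hours:.0f} hours of saturated uplink")
print(f"...and it uses {backup_gb / cap_gb:.0%} of a {cap_gb} GB monthly cap")
```

Nearly three days of maxed-out upload and close to a third of the month's cap for a single initial backup, which is a big part of why people treat cloud storage as a secondary backup rather than the primary copy.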

These are the same claims that were made when I originally posted about the growth of the cloud, and of course they are demonstrably wrong.

http://www.trekbbs.com/showpost.php?p=10586659&postcount=109

Again wrong: 4K is exploding. I've seen it just like you, and it's noticeably better in a living room.

http://www.digitaltrends.com/home-t...path-similar-to-hd-growth-pattern-but-faster/

nuff said
 
Uh, bandwidth relies on physical infrastructure, which means it is finite. By definition.

Everything physical is finite. What the hell are you talking about??
 
Uh, bandwidth relies on physical infrastructure, which means it is finite. By definition.

Everything physical is finite. What the hell are you talking about??

Well I was exaggerating for effect, hard to tell in text. :techman:

Suffice it to say there is no real limitation on the horizon in either use or bandwidth, and availability is only a speed bump, not a hindrance.

RAMA
 
I'll be sure to note that next time Verizon tries to get me to pay for more bandwidth.

"Some guy on the Internet said bandwidth is effectively unlimited! You guys are scamming me!"
 
I'll be sure to note that next time Verizon tries to get me to pay for more bandwidth.

"Some guy on the Internet said bandwidth is effectively unlimited! You guys are scamming me!"

And yet your bandwidth is several levels above what it was 5 years ago, no doubt. You get what you pay for too. :techman:
 
I'm sure we'll hear from the same people who said the cloud wouldn't take off and that 4K was going nowhere. :lol:

When you say "cloud" do you mean the Internet in general, or specifically storing and backup of personal files on "cloud" servers? Such storage can chew up bandwidth very rapidly, especially if one has a data capped plan. Also, lots of people simply do not trust cloud services with potentially sensitive material.

But it's high tech! That's reason enough to use it! Singularity! :scream:

I'd lump VR/AR in the same category as 3D and video phones. The technology has been around for a long time, and it's an exciting novelty to play with. But when the novelty wears off, very few actually adopt the new technology for everyday use.

As for 4K, more is not always better. Digital signal processing and transmission did more to improve video than resolution. That is, even contemporary video that is below "Standard Definition" looks vastly better than pre-digital systems.

In my opinion 4K/UHDTV is overkill for the living room. I imagine it will become commonplace one day, by default, but right now the increase in resolution is not so striking that it will leave 2K/HDTV in the dust. I'm talking from experience—seen it, worked with it. (VR/AR, too.)
Just like with touch screens, it's about performance, usability and content. No one cared about tablets until the iPhone and iPad came along, which combined high performance with good usability and lots of content.

Oculus-style VR goggles are only an inch away from taking off; there is a lot of potential in games and plenty of other use cases, but they still fall short on latency, resolution, and usability/user experience. AR glasses are further away. The Epson Moverio has been a big step forward, with a large stereo display, but the performance is horrible and the trackpad is terrible. Microsoft HoloLens looks promising, but it depends on whether the execution matches the concept video.

The industry is longing for VR and AR, especially AR, for marketing, production and technical documentation.
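On the latency point specifically, the arithmetic shows how tight the budget is. The refresh rates and the roughly 20 ms motion-to-photon target below are commonly cited figures, used here only as illustration:

```python
# Frame budget at common VR refresh rates versus the ~20 ms motion-to-photon
# latency often cited as the comfort threshold (illustrative figures only).

MOTION_TO_PHOTON_TARGET_MS = 20

for hz in (60, 75, 90, 120):
    frame_ms = 1000 / hz
    print(f"{hz} Hz -> {frame_ms:.1f} ms per frame; tracking, rendering and "
          f"display all have to fit inside ~{MOTION_TO_PHOTON_TARGET_MS} ms")
```

At 90 Hz that is about 11 ms per frame, which is why latency, more than raw resolution, is the hard engineering problem for headsets.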
It's taking off.
 