
The User Experiences Of The Future

Patrick Griffin writes "The way that we interact with technology is almost as important as what that technology does. Productivity has been improved greatly over the years as we've adapted ourselves and our tools to technological tasks. Just the same, the UI experience of most hardware and software often leaves novice users out in the cold. The site 'Smashing Magazine' has put together a presentation of 'some of the outstanding recent developments in the field of user experience design. Most techniques seem very futuristic, and are extremely impressive. Keep in mind: they might become ubiquitous over the next years.'"
  • by PIPBoy3000 ( 619296 ) on Tuesday November 27, 2007 @11:47AM (#21492735)
    They really seem to be pushing 3D interfaces in the article. While that's neat and all, I suspect there's a reason not every book is a pop-up book. Flat, 2D representations of data are typically the most efficient for our brain and eyeballs. For entertainment and representing 3D data, it can make sense. I just don't plan on coding in 3D any time soon.
    • by TheSpoom ( 715771 ) * <slashdot&uberm00,net> on Tuesday November 27, 2007 @11:58AM (#21492873) Homepage Journal
      I suspect we'll have 3D interfaces when it becomes cheap to manufacture displays that can actually project a 3D interface. Screw 2D projections of a 3D world, I want my VR!

      Speaking of which, the future needs the following three Star Trek items to solve everything all at once:
      • Teleporters (solves all transportation issues)
      • Replicators (solves hunger)
      • Holodeck (solves sexual ten... I mean, makes simulation much easier. Yes, that's it)
      So seriously, science, it only took you like twenty years to catch up to the first Star Trek, what the hell?

      *mumbles indistinctly about his flying car*

      • by Selfbain ( 624722 ) on Tuesday November 27, 2007 @12:06PM (#21492977)
        The Star Trek version of a teleporter is essentially a suicide booth. It rips you apart and then makes a copy on the other end. Do not want.
        • by TuringTest ( 533084 ) on Tuesday November 27, 2007 @12:13PM (#21493081) Journal
          You wouldn't notice when you've been terminated, and the other copy would still think that HE is YOU. So how would you tell it? And why should you care?
          • by Selfbain ( 624722 ) on Tuesday November 27, 2007 @12:16PM (#21493139)
            Arthur: I'd notice the difference. Zaphod: No you wouldn't, you'd be programmed not to.
          • by Lijemo ( 740145 ) on Tuesday November 27, 2007 @12:42PM (#21493491)

            How would we know it didn't happen this way:

            You just experience being painfully killed: poof, bye-bye, and then afterlife, nonexistence, or reincarnation, depending on your beliefs.

            Meanwhile, the copy of you with all your memories (or at least all of them from before the "teleport") doesn't realize that you have experienced death-- or even that s/he isn't you but a copy. It would be the same to everyone you know-- they wouldn't be able to tell that you'd been replaced by a doppelganger. Your replacement, not knowing any better, would assure everyone that the process was completely safe and painless, and that "you" came out the other end just fine.

            The only person that would know the difference is you, except you're not around anymore to know or tell. You're dead.

            I'm not sure how one would test a teleportation system to see whether the person going in actually experiences coming out at the other end, or whether the person going in experiences death, and a copy at the other end doesn't know the difference. Or at least, how one could test it and relay the results to others.

            Then we can further complicate the question: suppose that you die due to reasons unrelated to teleportation. And you last used a teleporter about a year back, but the teleporter saved your "pattern"-- so your grieving loved ones are able to "recreate" you, exactly as you were when you came out the teleporter-- the only difference is that you'd be confused as to how a year had passed since you'd gone in, and everyone else has memories of you during that time that you didn't experience. Is that you? Or not?

            • Re: (Score:3, Informative)

              by Z34107 ( 925136 )

              Then we can further complicate the question: suppose that you die due to reasons unrelated to teleportation. And you last used a teleporter about a year back, but the teleporter saved your "pattern"-- so your grieving loved ones are able to "recreate" you, exactly as you were when you came out the teleporter-- the only difference is that you'd be confused as to how a year had passed since you'd gone in, and everyone else has memories of you during that time that you didn't experience. Is that you? Or not?

              • Re: (Score:3, Informative)

                by Rolgar ( 556636 )
                Well, there was the episode where they found another Riker: he'd been successfully teleported out of a dangerous situation, but a copy was accidentally left on the planet (Season 6, Episode 24, "Second Chances").
                • That was just technobabble to give Riker an evil twin.

                  It's almost as good as the separated-at-birth-and-hidden-by-his-parents story.
              • Also, if you have a cup of coffee and go through the transporter, do you lose all your caffeine? Replicators are meant to be just transporter technology that assembles food from raw material held somewhere in the ship. Or at least that's what I read a few years ago; I'm not really a Trekkie any longer *hides*
            • by Nullav ( 1053766 )
              I can't see that being much of a problem unless the subject subscribes to dualism. Well, except in the 'resurrection' example, in which case it would be confusing as hell for the copy.
            • Reminds me of The Prestige [imdb.com], only in the movie, the duplicated subject had to be killed after the fact.
            • Incremental backups would fix that right up!
          • by Creedo ( 548980 )
            And why should you care?
            Well, it might just be me, but I wouldn't want to be erased just to put a copy of me somewhere very quickly.
              I hate to break this to you, but this has already happened to you several times, albeit more gradually. Most (if not all) of your molecules have been replaced continuously throughout your life since you've been eating, drinking, breathing, sweating, shedding, growing, shrinking and using the bathroom (sorry, I didn't want to say "peeing and pooping" because it sounded too juven--well, shit).
              • That's not true of your memory though. Your memory-containing neuron cells aren't replaced. When a neuron that contains memories dies, those memories are dead. It doesn't transfer them to new ones.
                • Re: (Score:3, Interesting)

                  I'm no neuroscientist, but I'm of the understanding that individual neurons don't contain memories. Those are believed to be encoded through the vast interconnectedness of many individual neurons and when one neuron dies, the rest route around it so nothing is necessarily lost. Some new neurons are created throughout our adult lives and even a neuron that's been with you since birth will have had most of its molecules completely replaced several times. The original DNA molecule's probably still in there-

              • by Creedo ( 548980 )
                Ah, but the crucial difference here is continuity of consciousness. I'd have no problems with having my whole body replaced, as long as my consciousness is not erased. Even sleeping people have continuity.
          • Why does your post make me think of Calvin's transmogrifier?
          • Re: (Score:2, Insightful)

            by Parasome ( 882135 )
            I read a thought experiment by, I think, Arthur C. Clarke that went something like this: suppose you're an astronaut stranded on Mars with your spaceship wrecked, but your teleporter is still functional, so you can beam back home. Unfortunately, the part of the process that erases the original does not work. So you will return to your loved ones and live happily ever after, and simultaneously die a miserable death alone on Mars.

            Well...

          • You wouldn't notice when you've been terminated, and the other copy would still think that HE is YOU. So how would you tell it?

            I wouldn't be able to tell because I'd be dead!

            And why should you care?

            Because I'd be dead!

        • No, it's not. There has been at least one episode that shows that, from the point of view of the transported person, consciousness is continuous.

          The treknobabble explanation has something to do with quantum mechanics, which, as every sci-fi fan knows, is magic.
        • "The Star Trek version of a teleporter is essentially a suicide booth. It rips you apart and then makes a copy on the other end. Do not want."

          Ok Dr. McCoy... your views on the transporter are VERY well known by now...

          Let's move along now...

          :-)

        • This discussion reminds me of "A Matter of Bandwidth [sri.com]" by Lauren Weinstein, which appeared in the April 1999 CACM. A memorable section of that article:

          Some early MT researchers had advocated omission of the final "dissolution" step in the teleportation process, citing various metaphysical concerns. However, the importance of avoiding the long-term continuance of both the source and target objects was clearly underscored in the infamous "Thousand Clowns" incident at the Bent Fork National Laboratory in 1979. For similar reasons, use of multicast protocols for teleportation is contraindicated except in highly specialized (and mostly classified) environments.

      • They all seem to work the same way... which brings me to wonder: if you configure a holodeck and cross-reference it with a replicator, and you spend months or years in it eating holodeck food, which will work like real food while you are in the holodeck, then after a while your body and mass will become completely a holodeck creation, as your old cells die and new ones are created from virtual matter. So you could then save yourself to disk. Run backups of yourself in case you do something stupid. Or just turn yours
      • I can certainly see the holodeck as a boon to business. It would be nice to be able to virtually attend a meeting rather than drive across town, or crowd around a conference call. However, I suspect that the teleporter, and its simpler cousin - replicators - are going to meet with a lot of resistance.

        Having a device that can create goods out of basic materials that are locally available would kill the current economy. Granted, we would still have to pay for power and patented "Replicator" data designs. How

        • Congratulations, you have just described what companies are doing with Second Life.
          It's a holodeck. You can do anything with it, including meetings, classes, roleplay, whatever.
    • by mikelieman ( 35628 ) on Tuesday November 27, 2007 @12:05PM (#21492961) Homepage
      I suspect we will find that the top percentile of expert users will instead eschew all the "innovations" and use a window manager like Ratpoison, which presents each window as its own FULL SCREEN entity, without losing real estate to window borders, taskbars, and other widgets.

      It's a Zen thing, you just wouldn't understand.

      • Re: (Score:3, Interesting)

        by TuringTest ( 533084 )

        presents each window as its own FULL SCREEN entity, without losing real estate to window borders, taskbars, and other widgets. It's a Zen thing, you just wouldn't understand.

        Actually, the real breakthrough in user experience would be an interface allowing that kind of 'zen' without needing to be an expert user. The Humane Interface [wikipedia.org] was a step in the direction of such an interface, but its current proof-of-concept implementation is unfortunately not developed enough to live up to expectations.

        • by jythie ( 914043 ) on Tuesday November 27, 2007 @12:32PM (#21493353)
          I think part of the problem in these various usability debates is that a good UI for learning and bringing in newbies is not the most effective solution once one has greater needs.

          This 'one size fits all' mentality is the issue. We need interfaces that scale from basic to advanced, so the basic users don't get slammed with all the advanced stuff and advanced users don't find themselves without the tools they need to actually do their work.
          • I think part of the problem in these various usability debates is that a good UI for learning and bringing in newbies is not the most effective solution once one has greater needs.

            True, but why not? Just because we don't know how to make it work, not because it is a bad idea.

            That's why we need a breakthrough. A system both learnable for newbies and efficient for experts is the holy grail of user experience. It can't be done with current mainstream commercial toolkits (too rooted in the WIMP paradigm), but new technologies (like multitouch and gesture recognition) and new paradigms (like Programming By Example [wikipedia.org]) could be the way to build such complex systems.
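
            To make the Programming By Example reference a bit more concrete, here is a minimal, hypothetical sketch (my own toy illustration, not anything from TFA or the Wikipedia article; the file names and the prefix-replacement rule are invented): the user demonstrates one edit by hand, the system infers a rule from that single example and proposes it for the rest, and the user stays free to reject the suggestion when the inference is wrong.

            # Toy Programming-By-Example sketch: infer a rename rule from one demonstration.
            def infer_rule(before: str, after: str):
                """Infer a 'replace common prefix' rule from a single before/after example."""
                # Find the longest shared suffix; whatever differs at the front is the rule.
                i = 0
                while i < min(len(before), len(after)) and before[-1 - i] == after[-1 - i]:
                    i += 1
                old_prefix, new_prefix = before[:len(before) - i], after[:len(after) - i]
                return lambda s: new_prefix + s[len(old_prefix):] if s.startswith(old_prefix) else s

            # The user renames one file by hand...
            rule = infer_rule("IMG_0001.jpg", "hawaii_0001.jpg")

            # ...and the system proposes the same edit for the rest, subject to the user's veto.
            for name in ["IMG_0002.jpg", "IMG_0003.jpg", "DSC_0004.jpg"]:
                print(name, "->", rule(name))   # DSC_0004.jpg is left alone: the rule doesn't apply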

            • Re: (Score:3, Interesting)

              by jythie ( 914043 )
              Unfortunately the two tend to be mutually exclusive.

              When we look at all these slick 'intelligent' interfaces that are newbie oriented, they all hinge on the computer figuring out what the user 'intends' to do. They work because they wrap up and automate the common cases, but in doing so they inherently limit the possible functionality.

              When one looks at these technologies, even things like Programming By Example, they are cases of automating the usage of the computer like an appliance. They tend to make li
              • Re: (Score:3, Interesting)

                by TuringTest ( 533084 )

                When we look at all these slick 'intelligent' interfaces that are newbie oriented, they all hinge on the computer figuring out what the user 'intends' to do.

                Yes, as that is the core of what usability is about. Though note that a really usable system MUST let the user override the inferred 'intent' when it's wrong.

                They work because they wrap up and automate the common cases, but in doing so they inherently limit the possible functionality.

                The common cases MUST be fully automated. For the system to be efficient for experts, it must also automate the uncommon cases; there's nothing in that that prevents including a language for expressing both types of automation in the same interface. So the functionality is not inherently limited - the system wrapped for common cases could also be expa

            • Expert user interfaces are poorly understood because it is exceedingly hard to do research on them. If you want to test a new interface for newbies, you just put together a prototype, grab some random people off the street, and do a user study. If you wanted to do the same thing for experts, you would first have to train them for weeks, possibly even months or years, and THEN do the study. Apart from this process being very expensive, you would be hard pressed to find subjects that are willing to put in
    • I am not sure about this 3D technology. It is a good step in the right direction, but semi-transparent images are still a real issue. Right now it looks really cool because it is like Star Wars... but for normal use it will get old fast. If it can make solid-looking objects too, then we may have a way to start a good interface.
    • Never mind the fact that over-the-top physical desktop metaphors have never caught on in twenty years of being the "next thing".
    • Re: (Score:3, Insightful)

      by jibster ( 223164 )
      I humbly disagree with you. Our brains have clearly evolved for a 3D world. I believe the reason you believe 2D is more efficient is that 3D has a very long history of not being done right. There's a good reason why that is: 3D is far more computationally expensive than 2D and lacks a true 3D display and interaction device.

      I offer as evidence the spring-and-plastic-ball models of molecules, and the skeletons in doctors' offices.

      2D clearly has its place, but I expect 3D to start elbowing in on it as soon as th
      • Re: (Score:3, Insightful)

        by grumbel ( 592662 )
        ### Our brains have clearly evolved for a 3D world.

        From where did you get that? Our movement is for the most part pretty much limited to 2D (forward, backward, left, right are good, but up and down are heavily restricted), the earth is flat (at least from a human point of view), and there really isn't all that much true 3D in our daily lives. Sometimes we stick a bunch of 2D things into a hierarchical structure, but that's as 3D as it ever gets. Our eyes of course also only see a 2D view of the world, sure a little de
        • by VE3MTM ( 635378 )
          If we were built for 3D we wouldn't get dizzy when playing Descent, but quite frankly, most do.

          When you develop a version of Descent that stimulates the inner ear to match the ship's motion, and people still get disoriented, then get back to me.
      • As the other response to your post explains, the evolution to work in 3D is not all that it is made out to be. In fact, most people are really bad at navigating in 3D, unless they have trained at it in video games or as pilots. I attended a conference once where the speaker was making that point. He asked the audience to point toward their hotel room (we were on the second floor of a hotel tower). People were pointing pretty much all over the place, and these were graphics/UI people, who you'd expect to
    • by hey! ( 33014 )
      Plus, we've already been down the road of user interface literalism -- it didn't go anywhere.

      Last time it was the adoption of systems with graphical capabilities that led us down the road; when people started getting VGA instead of text interfaces, people thought it would be oh-so-usable to use a graphical representation of a desktop for the desktop. We're in the same boat now with 3D.

      The most useful UI ideas don't seem to have close real world analogs. The "window"? Where they have superficial real worl
    • The article isn't about user interfaces that are actually more usable; it seems to be entirely about interfaces that are flashy and glamorous-- eye candy (and maybe, to a small extent, touch candy). The main problem with user interfaces today is that they are bafflingly opaque-- about the only way to learn most user interfaces is to just press all the buttons in sequence and see what they do. I hate glitz; I want function. Has anybody actually ever thought about figuring out what users actu
    • by vadim_t ( 324782 )
      How about this [compiz-fusion.org]? It looks pretty cool with the glasses.
    • I feel for people who are still trying to make sense of their database schemas without explicit Relationships and in 2D.

      I developed a schema and source code parsing technique for detecting Relationships in a well-normalized database, then took the output of that (750+ Tables & 1,200+ Relationships) and developed 3D (VRML) presentation techniques to let me SEE the Tables and Relationships.

      I've used it to see the Tables and Relationships in other clients' databases as well. It's a very useful techniqu
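
      In case anyone wonders what "detecting Relationships" might look like, here is a minimal, hypothetical sketch (my own guess at one heuristic, not the poster's actual tool; the demo tables are invented): with no declared foreign keys, infer candidate relationships from naming conventions, then feed the resulting edges to whatever graph/VRML layout you prefer.

      # Hypothetical heuristic: a column named customer_id probably points at table "customer".
      import sqlite3

      def guess_relationships(conn):
          """Return (child_table, column, parent_table) triples guessed from column names."""
          tables = [r[0] for r in conn.execute(
              "SELECT name FROM sqlite_master WHERE type='table'")]
          edges = []
          for table in tables:
              for _cid, col, *_rest in conn.execute(f"PRAGMA table_info({table})"):
                  if col.endswith("_id") and col[:-3] in tables:
                      edges.append((table, col, col[:-3]))
          return edges

      # Tiny demo schema with no declared foreign keys.
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
          CREATE TABLE orders   (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
      """)
      for child, col, parent in guess_relationships(conn):
          print(f"{child}.{col} -> {parent}")   # edges to hand off to a 3D layout step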
  • by gillbates ( 106458 ) on Tuesday November 27, 2007 @11:47AM (#21492737) Homepage Journal

    You'll be able to squeeze in a trip to Starbucks between reboots. And this in early-morning, rush-hour traffic.

    Seriously, the most problematic part about today's user experience is that the majority of the computers run Windows, and more slowly than they did 20 years ago. Sure, you get nice, pretty graphics, but when you're actually trying to get work done, you'd rather have a responsive machine.

    • Re: (Score:3, Interesting)

      by capt.Hij ( 318203 )
      I would have strongly disagreed with your sentiment a short time ago but have changed my mind recently. The things in the article looked like fun but will have a hard time being accepted.

      The thing that changed my mind is that I had to install a machine with Vista on it, and it was my first experience with the new OS. The machine is a new dual-core Intel with 1GB of memory. It should be a screamer, but it is essentially the same as the five-year-old XP machine it replaced. The secretary who has to use the machine d
      • Re: (Score:3, Interesting)

        by cowscows ( 103644 )
        The problem with something like desktop Linux is that (for the average user) the changes either don't show enough immediate benefit to make relearning worthwhile, or don't offer enough of a difference to make the change interesting.

        Using the Wii as an example as you did, the Wiimote is a pretty big change in how controllers work. Even if you don't see the potential of it right away, it's so different and a little bit wacky that it's interesting enough that you want to give it a shot. But let's say that i
    • When I began working for a wISP earlier in the year, I quickly moved from using Windows on my laptop to Linux. I'd always wanted to try it out (and I always loved multiple desktops!) so I jumped right in.

      Something that is, to a lot of people, eye candy proved very useful to me: the cube desktop. When in the field I didn't always have a place to put a mouse, and got very used to using the TrackPoint on my ThinkPad. With the desktop cube able to freely rotate, I could very quickly move away from what I was worki
    • Re: (Score:3, Informative)

      by ByOhTek ( 1181381 )
      The latest and greatest Windows is typically slow and clunky, but all things considered, what alternative would you pick?

      I ran Ubuntu on my notebook, next to FreeBSD and Windows XP. For responsiveness, it typically ran like this: FreeBSD/KDE > Windows > Kubuntu or Ubuntu >> Windows while a virus scan was running.

      Given that a virus scan usually is running in Windows, the last entry is required.

      I wouldn't give the majority of users FreeBSD. There's Mac OS, but my experiences with it (a dual-core Core 2 machine),
    • Re: (Score:3, Insightful)

      by geobeck ( 924637 )

      You'll be able to squeeze in a trip to Starbucks between reboots.

      I was just thinking along those lines the other day, as I was waiting for a Facebook page to load. I made a few personal websites back in the early days of HTML, and my philosophy was that if my page took longer than 5 seconds to load, the viewer would hit 'Back' and go somewhere else. Nowadays I always browse in multiple tabs so I don't have to sit idle while each page loads--which can take close to a minute.

      I don't know what the user

  • by Entropius ( 188861 ) on Tuesday November 27, 2007 @11:51AM (#21492791)
    The metaphors we're using now work pretty well, and UI changes in the future will probably consist more of refinements of these rather than totally new things, at least until and unless there is a major advance in display technology.

    As an example of a well-engineered UI that can make otherwise extremely tedious tasks manageable: Google's Picasa photo manager. It manages to deal with huge amounts of data (3700x2600 JPEGs or whatever 10MP comes out to, and 24MB RAW files), run quickly, and show you relevant stuff.

    The 3D rotating super+tab screen for task switching in Compiz is another example of using extra computing power to show something useful.

    Opera's introduction of mouse gestures is another good idea.
  • "Productivity has been improved greatly over the years"

    It has? Where is this increased productivity of which you speak?

    I see people doing things differently than they did years ago, but I would hesitate to call it increased productivity.
    • Re: (Score:3, Funny)

      by Selfbain ( 624722 )
      My ability to read slashdot at work has improved greatly over the years.
    • by kebes ( 861706 ) on Tuesday November 27, 2007 @12:26PM (#21493265) Journal

      Where is this increased productivity of which you speak?
      I think it's easy to miss the increased productivity because our standards rise very quickly with enabling technologies.

      For instance, I can sit down at my computer, grab dozens of scientific articles in a few minutes, write a summary of them, and have it typeset to publication quality with a few clicks. I can then launch a professional-quality graphics art program to make a few figures. I then put it all together and send it to someone (who gets it within seconds).

      The same operation would previously have taken much more time and money, not to mention specialist talent. (E.g. numerous trips to library, typing and re-typing a manuscript, hiring a graphic artist to make a figure, and mailing the finished product would have taken weeks of time, hundreds of dollars, etc.) And I haven't even mentioned things that are inherently compute-bound (e.g. how long would it take to run a complicated simulation today vs. ten years ago?).

      In short, these technologies have enabled the individual to do things that previously only specialists could do, and have allowed everyone to complete their work faster than before. It's easy to dismiss this since the promised "additional free time" from increased productivity never materializes: instead we merely increase our standards of quantity and quality. Many people don't even see this as progress (e.g. many people would prefer handing off tasks like typing and typesetting to others, whereas nowadays the norm is for everyone to do this themselves).

      Nevertheless, the net amount of "stuff" that a person produces (documents, designs, computations, admin tasks completed, etc.) has indeed increased in breadth, quantity and quality, due to the use of computers, networks, and our modern clever user-interfaces.

      I, for one, am much more productive using a computer than I would be otherwise. And if anyone thinks that their computer isn't making them more productive, then I challenge them to try to complete daily tasks without it, and see how long/arduous things actually are without.
      • You say "these technologies have enabled the individual to do things that previously only specialists could do, and have allowed everyone to complete their work faster than before."

        That is not an example of being more productive. Before these technologies existed, most people didn't *need* to do the things they now must do to complete their work.

        Today I have a computer with a 3 GHz processor and 2 GB of RAM to help me do my job. When I first started my career, I used to time-share some processor time on a
        • Your program from today is much more complex, does a hell of a lot more things, and, as you say, looks nicer. If you say you are taking just as long to produce it as the old program, then by definition, you are more productive. Your argument makes no sense. Just because the output in both cases is "a program" doesn't mean they are equivalent.
    • Oh, goodness. How can you even ask the question?

      Would you believe it changed the whole basis of financial economics: everyone now "gets it" that the value of a stock is independent of whether the company is doing anything useful or making a profit?

      Would you believe it created the dot-com revolution?

      Would you believe it sparked the endless bull market and gave ordinary Americans access to the secret of wealth without work?

      Would you believe it created 401(k)s growing at 20% per year and has made the average work
    • by c_forq ( 924234 )
      I know in my field computers have caused productivity to leap vast amounts, but we use mostly AS/400 systems, and most of the office has thin clients and VNC into the important stuff. Now I will give you that if we lose telephone or internet then the office grinds to a halt, and if the server goes down the warehouse can only work on already-printed tickets, but those outages are rare and keep getting scarcer.
  • Where's Jeff Han's [ted.com] lightbox multi-touch stuff?
  • What about Touch Feedback?

    Ticker symbols IMMR and NVNT.OB (Novint Falcon sold @ CompUSA and supports Half-Life) come to mind.

  • by TuringTest ( 533084 ) on Tuesday November 27, 2007 @12:05PM (#21492963) Journal
    Those futuristic FX have barely anything to do with what the end user gets as 'experience'. The real experience is about the feelings of the user.

    Unfortunately, the most common feelings provoked by today's interfaces are anger and frustration. That's because the interface is littered with rough, unpolished edges, and because software is designed as a bag full of (unrelated) features - instead of as a means to achieve an end - the process of actually using a feature is rarely taken into the design, let alone tested and debugged with real users.

    A really good development in user experience would be a way to force programmers to follow
    this kind of advice [joelonsoftware.com].
    • That doesn't really mean anything...

      What do people expect? A programmer expects something different from a layman. A person familiar with a certain mapping of symbol to action will expect something different from a person familiar with a different mapping.

      At the end of the day, we have to go back to the beginning, when everyone's expectations are roughly the same. That is to say, we have to go back to how we would expect something to happen before we learned through trial and error that things don't always
        • The good thing about computers is that we can redefine what doing a task requires. In your 'playing an instrument' example, thanks to computers it only requires one finger (to press the 'play' key for your MP3). With a two-handed interface, a damn whole lot of tasks that are clumsy with a mouse will feel natural again (zooming?).

          Also, because the computer remembers state, complex tasks may be decomposed into simpler ones, each one requiring just a simple interaction (again with a music example, that would
  • What I'd really like to see coming in my lifetime is a fully immersive cyberspace-like interface, but done via direct neural stimulation/reception through some kind of cranium socket.
      I saw very simple versions of this implemented on a TV programme a while back, enabling a blind man to 'see' numbers via electrodes surgically implanted in his visual cortex. It would be amazing to scale it up to the full simstim thing as in Neuromancer, though.
  • Am I the only one who sees that this is nothing more than a giant four-sided heads-up display?
  • User experience (Score:5, Insightful)

    by Alioth ( 221270 ) <no@spam> on Tuesday November 27, 2007 @12:15PM (#21493127) Journal
    AAArrrgh. User experience.

    I don't want a user experience. If I'm having a "user experience", then the application or operating system is getting in my way. I want the OS or app to melt into the background so I hardly think that I'm using it.
    • by jythie ( 914043 )
      Well put.

      This is actually a good example of why I like OS X (running with most of the silly stuff turned off). It stays the expletive out of my way and makes for an efficient task switcher. Outside of booting, launching the apps I want, and switching between them, I basically do not interact with it. Which is nice.
    • If you are not operating the OS or application, what is it that you are operating?
  • by dpbsmith ( 263124 ) on Tuesday November 27, 2007 @12:21PM (#21493189) Homepage
    "Keep in mind: they might become ubiquitous over the next years."

    Why should I keep that in mind? Do I need to prepare myself mentally to compete in the brave new world? Do I need to worry that people who keep in mind that these interfaces might become ubiquitous will become so much better at operating computers than me that I'll become unemployable? Where can I find a community college course on how to play 3D video games?

    But, but, but: the fear factor. They might become ubiquitous over the next years. Maybe. And then again, maybe not.

    What if I back the wrong horse? What if I budget three hours a day to do exercises to hone my spatial perception skills to a scalpel-like edge, only to find that the real winners are those who anticipated the rise of olfaction-based user interfaces?

    Well, gotta go... time to do my PL/I programming exercises. PL/I, it's the wave of the future, y'know.

  • by hey! ( 33014 ) on Tuesday November 27, 2007 @12:29PM (#21493315) Homepage Journal
    It would be a good thing.

    (user interface techniques don't count as design)
  • I'd be willing to bet this article will be proven wrong. No, I did not RTFA. Where's that Randi guy?
  • by SmallFurryCreature ( 593017 ) on Tuesday November 27, 2007 @12:45PM (#21493541) Journal

    They link to a review of it, so here is my own. We accept for the moment that it will ONLY work with MS software and MS-approved hardware.

    I put my MS-approved camera on the Surface, and up pops an enormous window telling me I have to agree to a EULA (exactly what happens when you access MS Media Player for the first time); it then finally allows me to download the photos. I then try to put them on my Zune 2.0. OOPS, cannot do that: the camera is digital and the Zune only accepts analog (Zune 2.0 doesn't allow the uploading of movies captured with a digital TV tuner, only analog tuners).

    Starting to get the picture? All these things sound nice when you just see the pre-scripted demo, but when it comes to real life, well, it all just breaks down. Especially when it comes to Microsoft.

    Same thing with multi-touch screens: very nice, but how much software will be written to make use of it when so few people will have such a screen? I remember that System Shock, ages ago, had support for 3D helmets; it was a hot topic back then and one that never happened. SS was one of the few games to support such systems; the others wisely did not bother since nobody had such helmets, and because few games supported them, what was the point in getting one?

    I can make a game around the Logitech G15 keyboard that makes the device indispensable to play, but I would be really hurting my chances of selling the game.

    All these devices are interesting enough, but destined to remain obscure simply because people won't be buying them unless there is a killer application for them, and nobody will build such an application until there is a larger installed base.

  • by petes_PoV ( 912422 ) on Tuesday November 27, 2007 @12:55PM (#21493671)
    All these "futuristic" interfaces fall foul of the "flying car" effect. In the past people expected that by now (well, by about 1980) we'd all have given up out automobiles for flying cars. These UIs are the computing equivalent - they take our current limited experiences and extrapolate them.

    In practice, anything that involves waving your arms around, a la Minority Report, will be the fastest way to get tired arms ever invented. So that's the Reactable, multi-touch and Microsoft Surface out of the running. Imagine doing that for an 8-hour shift in your datacentre. Completely impractical, but like flying cars, it looks great to the uninformed.

    Let's face it, typing is quicker than mousing - you've got 110 keys at your disposal instead of just 2 (or up to 5 - wow wee!!!), and the number you can press at once is limited only by the number of fingers you can manipulate, not by the number of things available to press. Just try writing a letter by mouse clicks. Typing is even quicker than speaking - especially when you have to go back and change the phonetically (sorry, fonetically) spelled words that come out.

    Personally, all I want from a UI is one that doesn't steal focus from my window to pop up a "Shall I delete all your files Y / n" prompt just when I think I'm going to hit Enter in a text window. It should keep the clutter off my screen and just show me the stuff I want. Aero Glass is nowhere near this (and probably going in the wrong direction anyway - far too complicated). Let's just keep it as simple as possible, but no simpler.

    • Re: (Score:2, Insightful)

      by AndrewNeo ( 979708 )
      Ever heard of a virtual keyboard?
    • I RTFA'd and it's basically a list of new input device methods. Big deal. You can put the flyest multi-touch interface on Windows, and while you can move the windows around with both your hands, you will still get pissed off that you get interrupted in the middle of your browsing session with "Do you want to remember this password?" (Firefox), that you can't undo that "transfer shares" action in Quicken, that setting up automated backups in Windows is still hard, and using them to recover is harder.

      None of
    • Re: (Score:3, Insightful)

      by ELProphet ( 909179 )
      Star Trek.

      The LCARS interface, designed by Michael Okuda for TNG, is really a vision of what I would like to see. A large touch area that dynamically updates (intelligently - e.g., the way I specify) its touch areas based on state. The keyboard in front of me takes most input, but I can touch specific areas on the screen for more esoteric actions - buttons, tabs, anything I'd normally "click" on. I can move my finger much faster and more precisely than my mouse, and I can type faster on a regularly sized key
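
      As a toy illustration of that "touch areas based on state" idea (my own hypothetical sketch, not LCARS or anything Okuda actually specified; the modes and action names are invented), the soft buttons shown can simply be a function of the current application mode:

      # State-driven soft-button panel: the visible touch targets depend on the mode.
      LAYOUTS = {
          "browsing": [("Back", "history_back"), ("New Tab", "tab_new"), ("Bookmark", "bookmark_add")],
          "editing":  [("Save", "file_save"), ("Undo", "edit_undo"), ("Find", "edit_find")],
      }

      def render_touch_panel(mode):
          """Return the buttons the touch surface should display for this mode."""
          return LAYOUTS.get(mode, [])

      def on_touch(mode, label):
          """Dispatch a touch on a soft button to the action bound in the current mode."""
          for name, action in render_touch_panel(mode):
              if name == label:
                  return action
          return None

      print(render_touch_panel("editing"))   # the panel is redrawn whenever the mode changes
      print(on_touch("editing", "Undo"))     # -> 'edit_undo'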
    • I think the main mistake people make in looking at these interfaces is looking for something to replace the keyboard. In reality, keyboards are extremely efficient methods of text-input, and I don't expect they'll be replaced anytime soon. Even speech-recognition isn't as simple and efficient as typing unless you're doing it for specific tasks.

      And that's where these new input methods could potentially shine: in specific tasks. I don't think computer programmers will be using multi-touch for entering tex

  • Ahhh, "vaporware." It's the tag that dismisses everything in record time.

    I don't understand what some folks would have these research centers do. Not work on new GUIs? Why not? Remember, many things never left PARC's labs. Some of it did, and even more of it went into the current crop of GUIs that we have today. (One can debate the ethics behind how the ideas made it out of PARC.) All of that research, even the stuff that didn't work, helped to achieve a better, more polished end result.

    And, of course, ou
  • by cliffski ( 65094 ) on Tuesday November 27, 2007 @01:45PM (#21494287) Homepage
    Trying to build a 3D interface that will 'simplify' our storage of data is just bollocks.
    I know where pretty much everything on my PC is. All my documents live in a sensible directory structure, and even if I lose one, I can do a desktop search.
    In the real world, I'm very confused. Where is that letter? Is it on my desk? In the desk drawer? Downstairs on the bookcase? Did I leave it in the car? In a kitchen drawer, maybe? Is that it? Is it upside down? I don't recognise it without a filename...

    My simple 2D desktop filing system is better than my real-life one. Don't try and make things worse just so we all need a 3D card to list our documents.
  • Forget 3D, multi-touch, or any other UI change -- get me more screen space first. I'm just ready for our computer desktops to actually have physical-desktop-sized displays. Repeated studies and surveys say that multiple monitors or larger monitors increase productivity.

    Oh, and maybe bring back the "turbo" button and put it on the mouse to actually cross all that screen space...
