Angel “Java” Lopez on Blog

October 15, 2009

A 10-Finger Multitouch User Interface

Filed under: Interface, Video — ajlopez @ 10:34 am

This is a video demonstrating new ideas about a multitouch device, from http://10gui.com/:

Over a quarter-century ago, Xerox introduced the modern graphical user interface paradigm we today take for granted.
That it has endured is a testament to the genius of its design. But the industry is now at a crossroads: New technologies promise higher-bandwidth interaction, but have yet to find a truly viable implementation.
10/GUI aims to bridge this gap by rethinking the desktop to leverage technology in an intuitive and powerful way.

More info at http://10gui.com/background/. You can follow Clayton Miller on Twitter: @claymill.

It’s interesting to see the drawing pad idea now leveraged with multitouch. This way, we could add multitouch capabilities to any monitor or hardware, instead of using a dedicated multitouch display. The problem: the fingers are out of our sight. The solution: the fingers are tracked and displayed on screen, in the same way we usually track the mouse using the on-screen arrow cursor.
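A minimal sketch of that tracking idea, using the web Pointer Events API purely for illustration (10/GUI’s own hardware and driver stack is not public): each touching finger gets its own pointer id, and we draw one on-screen cursor per tracked finger.

```typescript
// Sketch only: track each finger on a multitouch surface and draw an
// on-screen cursor per finger, the way 10/GUI shows fingers resting on
// an off-screen pad. The standard Pointer Events API is used just for
// illustration; 10/GUI's actual stack is not public.
// (A real page would also set `touch-action: none` on the canvas.)

const canvas = document.querySelector("canvas")!;
const ctx = canvas.getContext("2d")!;

// One entry per finger currently touching, keyed by pointerId.
const fingers = new Map<number, { x: number; y: number }>();

canvas.addEventListener("pointerdown", (e: PointerEvent) => {
  if (e.pointerType === "touch") {
    fingers.set(e.pointerId, { x: e.offsetX, y: e.offsetY });
  }
});

canvas.addEventListener("pointermove", (e: PointerEvent) => {
  if (fingers.has(e.pointerId)) {
    fingers.set(e.pointerId, { x: e.offsetX, y: e.offsetY });
  }
});

function endTouch(e: PointerEvent) {
  fingers.delete(e.pointerId);
}
canvas.addEventListener("pointerup", endTouch);
canvas.addEventListener("pointercancel", endTouch);

// Render loop: one translucent circle per tracked finger, so the user
// can "see" fingers that are physically out of sight on the pad.
function draw() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = "rgba(0, 120, 255, 0.5)";
  for (const { x, y } of fingers.values()) {
    ctx.beginPath();
    ctx.arc(x, y, 12, 0, 2 * Math.PI);
    ctx.fill();
  }
  requestAnimationFrame(draw);
}
draw();
```

Because the map is keyed by pointer id, the same pattern handles ten fingers as easily as one.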

The other interesting point is “con10uum” and its finger-set interaction. I guess it’s like learning to play a musical instrument: you must practice, but the finger gestures seem practical after a little effort.

Angel “Java” Lopez
http://www.ajlopez.com
http://twitter.com/ajlopez

September 8, 2009

Microsoft Surface Demo: Patient Consultation Interface by Infusion

Filed under: .NET, Interface, Software Development — ajlopez @ 9:40 am

Currently, I’m working on an agile team, developing a health care and administration application. I’m a newbie to the health care development world, but I’m impressed by the complexity and variety of requirements, and by the opportunities to explore and exploit. It’s an exciting field for development.

Presenting patient information to medical professionals can be a challenging job. One of the team members just discovered this video from http://www.infusion.com, a Microsoft partner, demoing a Surface application to view patient information:

I read this information on the YouTube video page:

MICROSOFT SURFACE PATIENT CONSULTATION INTERFACE
The Microsoft Surface Patient Consultation Interface enables doctors to relate complex concepts through simple interactions.

APPLICATION COMPONENTS

The Surface Patient Consultation Interface augments and facilitates the conversations that a doctor regularly has with his or her patients through a unique, interactive representation on the Microsoft Surface. With the use of static and active media elements, a doctor is able to demonstrate and relate complex medical procedures or conditions in layman’s terms to their patients.

Doctors are able to use this tool to exchange content and information with their patients, adding a feeling of security to the transfer of electronic information between doctor and patient. Through the use of slide menus, touch interaction, and a simple navigation system, the application gives doctors the opportunity to provide their patients with a valuable educational experience.

The application is divided into two distinct views and makes use of five interaction points:

VIEWS:

The Content View allows the viewing of shared content in a free-form fashion. This view facilitates easy observation and a simple summary of any topics shared during a session.

The Anatomic View presents content for viewing in the context of the human body. This view enables the uncomplicated observation of specific diagnosis information and educational content.

Within both views, content can be manipulated to allow doctor and patient to easily see and access information together. The three primary forms of content that can be displayed include: documents, photographs, and videos. The capability also exists for presenting additional content such as 3D models.

INTERACTION POINTS:

The Content and Anatomic views are traversed via five common elements.

PERSONAL IDENTIFICATION components enable both the patient and doctor to share and store information. Identification occurs when the patient or doctor places their identification card on the Surface. For the patient, the identification card provides the ability to share and receive content from their Microsoft HealthVault account. For doctors, identification allows them to share generic and educational content with the patient.

The ANATOMIC LOCATOR enables the doctor to focus on a specific area of the body. This action is performed by selecting and manipulating one of five body types that can be used for accessing content: exterior, organ, circulatory, nervous and skeletal.

The ORB MENU draws data from the patient’s HealthVault account when a patient enters the content view. This hierarchical and easily navigated menu enables the selection of new content for the current session through the selection and dragging of content orbs located near the patient’s HealthVault card.

The WEB MENU allows the doctor to display content within the Anatomical View. Once a body type is selected, key points on the body relating to the shared content are highlighted. This content includes documents, static images, and videos arranged around the point of interest.

CONTENT ITEMS are a part of the overall interaction within the application and consist of documents, photographs, and videos. These multimedia tools are embedded into the patient’s information, interaction points within body types, or any other educational portion of the application.

To learn more about Infusion and Microsoft Surface, visit: http://www.infusion.com or email surface@infusion.com.

More info about Infusion’s work with Surface can be found at their blog:

http://www.infusion.com/surfaceblog/

There are interesting topics there, such as tips for Surface development and UI design.

Angel “Java” Lopez
http://www.ajlopez.com
http://twitter.com/ajlopez

September 3, 2009

The Mother of All Demos

Filed under: Computer History, Interface — ajlopez @ 10:39 am

Last year, I wrote about this subject, but in Spanish:

http://msmvps.com/blogs/lopez/archive/2008/06/02/la-madre-de-todas-las-demos.aspx

The story is told in more detail at:

http://en.wikipedia.org/wiki/The_Mother_of_All_Demos

The Mother of All Demos is a name given retrospectively to Douglas Engelbart’s December 9, 1968 demonstration at the Fall Joint Computer Conference (FJCC) at the Convention Center in San Francisco, in which a number of experimental technologies that have since become commonplace were presented. The demo featured the first computer mouse the public had ever seen, as well as introducing interactive text, video conferencing, teleconferencing, email, and hypertext.

Yesterday, in a private email list, I received this list of videos. Enjoy! Info by bigkif:

On December 9, 1968, Douglas C. Engelbart and the group of 17 researchers working with him in the Augmentation Research Center at Stanford Research Institute in Menlo Park, CA, presented a 90-minute live public demonstration of the online system, NLS, they had been working on since 1962. The public presentation was a session of the Fall Joint Computer Conference held at the Convention Center in San Francisco, and it was attended by about 1,000 computer professionals. This was the public debut of the computer mouse.

But it was not only the mouse:

But the mouse was only one of many innovations demonstrated that day, including hypertext, object addressing and dynamic file linking, as well as shared-screen collaboration involving two persons at different sites communicating over a network with audio and video interface.

Angel “Java” Lopez
http://www.ajlopez.com
http://twitter.com/ajlopez

August 28, 2009

Augmented Reality in iPhone Applications

Filed under: Augmented Reality, Interface, Mobile — ajlopez @ 9:18 am

These are exciting days for iPhone applications. The API for Augmented Reality support on the iPhone will be released with the next version of the iPhone operating system. But there are some applications already using the new features.

One (possibly the first) application is the Paris Metro Subway app:

Yesterday, Robert Scoble discovered an Augmented Reality easter egg in the new Yelp application. According to:

the new Yelp app includes an AR easter egg

Social review service Yelp has snuck the first Augmented Reality (AR) iPhone app specifically for the US into the iTunes App Store. The undisclosed new feature allows iPhone 3GS owners to shake their phones three times to turn on a view called "the Monocle." This view uses the phone’s GPS and compass to display markers for restaurants, bars and other nearby businesses on top of the camera’s view.

Robert Scoble’s FriendFeed post: http://friendfeed.com/scobleizer/e6e411b4/new-yelp-iphone-app-is-also-out-there-cool-easter

Download the new Yelp app (came out yesterday). So you shake your iPhone 3 times. That activates a feature called Monocle. A message should come up if you activated it. A blue box will come up saying "the Monocle has been activated." It will create a button in the top right corner. Now you should be able to look at the bars, restaurants, etc. Only works on iPhone 3GS.

And now, Presslite (the same company that made Paris Metro Subway) just added Augmented Reality to its London Bus app:

There is no info about the API used (Presslite didn’t reveal anything about the tech). Candidates: ARToolkit, iPhoneARToolkit, ChromelessImagePickerController.
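Whatever the toolkit, the core geometry behind a Monocle-style view can be sketched: take the device’s GPS position and compass heading, compute the bearing to each nearby place, and map the angular difference onto a horizontal offset over the camera view. Here is a rough sketch; the bearing formula is standard great-circle math, but all the names, the 60° field-of-view value, and the example coordinates are my own assumptions, not anything published by Yelp or Presslite.

```typescript
// Sketch of the basic math behind a "Monocle"-style AR overlay:
// project nearby points of interest onto the camera view using only
// GPS position and compass heading. Names and structure are
// illustrative; none of the apps above have published their code.

interface Place {
  name: string;
  lat: number; // degrees
  lon: number; // degrees
}

const toRad = (deg: number) => (deg * Math.PI) / 180;
const toDeg = (rad: number) => (rad * 180) / Math.PI;

// Initial great-circle bearing from (lat1, lon1) to (lat2, lon2),
// in degrees clockwise from true north.
function bearing(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const phi1 = toRad(lat1);
  const phi2 = toRad(lat2);
  const dLambda = toRad(lon2 - lon1);
  const y = Math.sin(dLambda) * Math.cos(phi2);
  const x =
    Math.cos(phi1) * Math.sin(phi2) -
    Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLambda);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// Horizontal pixel position of a place's marker, or null if the
// place is outside the camera's horizontal field of view.
function markerX(
  deviceLat: number,
  deviceLon: number,
  headingDeg: number, // compass heading the camera is facing
  place: Place,
  screenWidth: number,
  fovDeg = 60 // assumed horizontal field of view of the camera
): number | null {
  const b = bearing(deviceLat, deviceLon, place.lat, place.lon);
  // Signed angle between camera heading and the place, in (-180, 180].
  const delta = ((b - headingDeg + 540) % 360) - 180;
  if (Math.abs(delta) > fovDeg / 2) return null; // off screen
  return (delta / fovDeg + 0.5) * screenWidth;
}

// Example: a device near the Ferry Building in San Francisco, facing
// west (heading 270), looking for a marker for a made-up cafe nearby.
const x = markerX(37.7955, -122.3937, 270, {
  name: "Cafe",
  lat: 37.7957,
  lon: -122.3999,
}, 320);
console.log(x); // roughly mid-screen, since the cafe is almost due west
```

A real app would also filter places by distance and stack overlapping markers, but the GPS-plus-compass projection above is the essential trick.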

More info at:

The Wall Has Fallen: 3 Augmented Reality Apps Now Live in iPhone App Store
Yelp Brings First US Augmented Reality App to iPhone Store
First iPhone Augmented Reality App Appears Live in App Store

Angel “Java” Lopez
http://www.ajlopez.com/en
http://twitter.com/ajlopez
