
Human-Computer Interaction

 

Part 1: A Moving Target – The Evolution of Human-Computer Interaction


In this paper Jonathan Grudin covers the major threads of research in four disciplines: human factors, information systems, computer science, and library and information science. CHI (Computer-Human Interaction) here has a narrower focus than HCI. Cyclic patterns and cumulative influences are evident throughout. A common touchstone in HCI is Moore’s law, which describes the growth in the number of transistors on an integrated circuit, but it is useful to consider the broader range of phenomena that exhibit exponential growth. Narrowly defined, Moore’s law may soon be revoked; broadly defined, this is unlikely. Advances in technology gave rise to two fields of research that later contributed to human-computer interaction: one focused on making the human use of tools more efficient, the other on ways to represent and distribute information more effectively.
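
As a rough illustration of the exponential growth Grudin alludes to, the short Python sketch below projects transistor counts under a simple two-year doubling model, starting from the roughly 2,300 transistors of the 1971 Intel 4004. The doubling period and starting point are conventional illustrative figures, not numbers taken from the paper.

    # Illustrative only: Moore's law as exponential growth, assuming a
    # two-year doubling period and the ~2,300 transistors of the 1971
    # Intel 4004 as a starting point.
    def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
        """Project a transistor count under a simple doubling model."""
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1991, 2011):
        print(year, f"{projected_transistors(year):,.0f}")

Under this model the count grows by a factor of roughly a thousand every twenty years, which is the sense in which the exponential trend matters more than any exact transistor figure.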

In the 19th century, index cards, folders, and filing cabinets were used to manage and organize information. Later, typewriters and carbon paper were used to store information, followed by the mimeograph machine. A century later, photography was also used to record large amounts of information, and microfilm became the most “efficient way to compress, duplicate, and disseminate large amounts of information.” These developments are the roots of the two threads behind HCI: human-tool interaction and information processing.

When ENIAC, the first general-purpose computer, was developed, memory was extremely expensive. At the time, the major focus of HCI research was computer operation, management and systems analysis, and programming. The main aim of man-computer symbiosis, a vision outlined by Licklider, was to bring the computing machine effectively into the formulative parts of technical problems. Engelbart’s concept of augmenting human intellect was to increase human capability to approach complex problems and devise solutions to them. In the mid-1960s, minicomputers that could handle personal productivity tools or a moderate database became available. Few data-processing projects were initiated, and human factors and information science were emphasized.

The psychologists and computer scientists who formed the early CHI community believed that interface design was a matter of science and engineering. Eventually, computer scientists became interested in computer graphics and AI, which led to a remarkable transformation of HCI. SIGCHI was formed and well received. Human factors and ergonomics (HF&E) embraced cognitive approaches. The advent of the GUI drew mixed responses, but its ease of use clearly attracted people. With the progress of LANs and the Internet, computers enabled effective communication and information sharing. Early on, the Web had the more dramatic effect on organizational IS. Gradually, CHI researchers’ focus shifted to the discretionary use of computers. Thus, HCI evolves every day.

 

 

More user-friendly and natural interfaces for human-machine interaction will yield a wealth of advantages and will have a major impact on everyday life. They will, no doubt, offer the chance to improve the quality of life of people who cannot take advantage of current interfaces because of physical disabilities. Technologies like augmented reality could give people experiences they could never have in real life. Scenarios that could previously be found only in science-fiction movies would become reality.

 

Part 2: Research Topic of the Week

 

 

Skinput Turns Your Arm Into a Touchscreen

Skinput is a new skin-based interface that allows users to use their own hands and arms as touchscreens. It accomplishes this by detecting the ultralow-frequency sounds produced when different parts of the skin are tapped. In Skinput, graphics are beamed onto a user’s forearm from a pico projector embedded in an armband. An acoustic detector in the armband then determines which part of the display the user has touched. Variations in bone density, size, and mass mean that different skin locations are acoustically distinct; software matches sound frequencies to specific skin locations, allowing the system to determine which button was pressed.
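
The matching step can be pictured with a small sketch. The Python fragment below is purely illustrative: the band-energy features, the nearest-profile comparison, and the calibration profiles are assumptions made for the sake of example, not the actual Skinput algorithm.

    # Hypothetical sketch: classify a tap by comparing its spectral
    # "signature" with profiles recorded during a calibration phase.
    import numpy as np

    def band_energies(signal, n_bands=10):
        """Summarize a tap recording as energy in a few frequency bands."""
        spectrum = np.abs(np.fft.rfft(signal))
        return np.array([band.sum() for band in np.array_split(spectrum, n_bands)])

    def classify_tap(signal, profiles):
        """Return the skin location whose stored profile is closest to this tap."""
        features = band_energies(signal)
        return min(profiles, key=lambda loc: np.linalg.norm(features - profiles[loc]))

    # Made-up calibration profiles for two locations on the arm.
    profiles = {
        "wrist":   np.array([9.0, 4.0, 2.0, 1.0, 0.5, 0.3, 0.2, 0.1, 0.1, 0.1]),
        "forearm": np.array([5.0, 6.0, 3.0, 1.5, 0.8, 0.4, 0.2, 0.1, 0.1, 0.1]),
    }
    tap = np.random.default_rng(0).normal(size=256)  # stand-in for a real recording
    print(classify_tap(tap, profiles))

A real system would learn these profiles from labeled taps for each user, since the acoustics depend on individual anatomy.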

 

 

 

LGT Medical’s Vital Signs DSP Uses Any Smart Device as Medical Sensor Interface (VIDEO)

LionsGate Technologies has announced a new technology offering that allows all sorts of medical sensors to easily use smartphones and tablets as their interface, by using the audio jack as a universal way to transfer data. They have already demonstrated their Vital Signs DSP technology by building a pulse oximeter that works straight off an iPhone’s audio jack and displays readings on its screen. This technology lets companies focus on the core technology they are working on, whether during development or for actual production of a cheap medical device that does not need its own display.
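
To make the audio-jack idea concrete, here is a toy Python sketch that carries a single sensor reading over an audio channel by encoding it as a tone frequency and recovering it from the spectrum on the receiving side. The encoding scheme, value range, and function names are assumptions for illustration only; the source does not describe LionsGate’s actual Vital Signs DSP protocol.

    # Toy example: a reading (e.g. an oxygen-saturation percentage) is mapped
    # to a tone between 1000 and 2000 Hz, sent as audio, and decoded by
    # finding the dominant frequency. Not LGT's real protocol.
    import numpy as np

    SAMPLE_RATE = 44100  # a typical audio sample rate

    def encode_reading(value, duration=0.1):
        """Encode a 0-100 reading as a sine tone at 1000 + 10*value Hz."""
        freq = 1000 + 10 * value
        t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
        return np.sin(2 * np.pi * freq * t)

    def decode_reading(audio):
        """Recover the reading from the dominant frequency in the tone."""
        spectrum = np.abs(np.fft.rfft(audio))
        freqs = np.fft.rfftfreq(len(audio), d=1 / SAMPLE_RATE)
        return round((freqs[np.argmax(spectrum)] - 1000) / 10)

    tone = encode_reading(97)    # e.g. a 97% SpO2 reading from the oximeter
    print(decode_reading(tone))  # prints 97

A production design would also need framing and error checking, but the sketch shows why an ordinary audio input can double as a generic sensor interface.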

 

 

 

 

 

Throw your mobile device away: the holographic interface is here

A holographic interface uses holograms instead of graphic images to produce projected pictures. Special laser light is shone onto or through holograms, and the projected light produces bright 2D or 3D images. While plain daylight lets you see some simple holograms, true 3D images require laser-based holographic projectors.

Ivan Tihienko’s concept is appealing: with holographic projection we could take our mobile activities onto the street without having to carry a smartphone or any other device. Whenever and wherever you want to play air hockey with a friend, just project the game as a hologram and start playing. A holographic GPS could also show the way right in front of you.