Many folks would like to see us back on the Moon and developing its resources.

Saturday, June 23, 2012

A tribute to Turing, the father of modern computing

Computers used to be people who did computing for the war effort — rooms full of women doing computations by hand.  Today computers are logic devices buried in silicon chips.  

There have been many steps in between, and Alan Turing helped greatly in getting us started on that path.  The Google Blog has a nice write-up to help us look back and reflect before looking forward again.  

Where will the path take us?
- LRK -

---------------------
http://googleblog.blogspot.com/
A tribute to Turing, the father of modern computing
June 22, 2012 at 4:00 PM
“The past is a foreign country—they do things differently there.” It’s a saying that rings especially true in the world of technology. But while innovating requires us to focus on the future, there are times when it’s important to look back. Today—the 100th anniversary of Alan Turing’s birth—is one such moment. 

Statue of Alan Turing at Bletchley Park

Turing’s life was one of astounding highs and devastating lows. While his wartime codebreaking saved thousands of lives, his own life was destroyed when he was convicted for homosexuality. But the tragedy of his story should not overshadow his legacy. Turing’s insight laid the foundations of the computer age. It’s no exaggeration to say he’s a founding father of every computer and Internet company today. 

snip
---------------------

A Wikipedia link.
- LRK -

---------------------
Alan Mathison Turing, OBE, FRS (/ˈtjʊərɪŋ/ TEWR-ing; 23 June 1912 – 7 June 1954), was a British mathematician, logician, cryptanalyst and computer scientist. He was highly influential in the development of computer science, providing a formalisation of the concepts of "algorithm" and "computation" with the Turing machine, which played a significant role in the creation of the modern computer.[1][2] Turing is widely considered to be the father of computer science and artificial intelligence.[3]
During World War II, Turing worked for the Government Code and Cypher School (GCCS) at Bletchley Park, Britain's codebreaking centre. For a time he was head of Hut 8, the section responsible for German naval cryptanalysis. He devised a number of techniques for breaking German ciphers, including the method of the bombe, an electromechanical machine that could find settings for the Enigma machine.
After the war he worked at the National Physical Laboratory, where he created one of the first designs for a stored-program computer, the ACE. In 1948 Turing joined Max Newman's Computing Laboratory at Manchester University, where he assisted in the development of the Manchester computers[4] and became interested in mathematical biology. He wrote a paper on the chemical basis of morphogenesis,[5] and he predicted oscillating chemical reactions such as the Belousov–Zhabotinsky reaction, which were first observed in the 1960s.
Turing's homosexuality resulted in a criminal prosecution in 1952, when homosexual acts were still illegal in the United Kingdom. He accepted treatment with female hormones (chemical castration) as an alternative to prison. He died in 1954, just over two weeks before his 42nd birthday, from cyanide poisoning. An inquest determined it was suicide; his mother and some others believed his death was accidental. On 10 September 2009, following an Internet campaign, British Prime Minister Gordon Brown made an official public apology on behalf of the British government for the way in which Turing was treated after the war because of his sexual orientation.[6]

snip
---------------------
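The Wikipedia excerpt above mentions Turing's formalisation of "algorithm" and "computation" with the Turing machine. As a small aside, the idea can be sketched in a few lines of code: a machine is just a transition table mapping (state, symbol) to (new state, symbol to write, head move). This is an illustrative sketch of my own, not from any of the quoted sources; the example machine, which increments a binary number, is invented for this post.

```python
# Minimal single-tape Turing machine simulator (illustrative sketch).
# transitions maps (state, symbol) -> (new_state, write_symbol, move),
# where move is "L" or "R". The machine halts when no rule applies.

def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
    """Run the machine and return the final tape as a string."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break  # no rule for this (state, symbol): halt
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    out = [cells.get(i, blank) for i in range(min(cells), max(cells) + 1)]
    return "".join(out).strip(blank)

# Example machine: increment a binary number. Scan right to the end of
# the input, then carry leftward (1 -> 0 with carry, 0 or blank -> 1, done).
INCREMENT = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("done", "1", "L"),
    ("carry", "_"): ("done", "1", "L"),
}

print(run_turing_machine(INCREMENT, "1011"))  # 1011 + 1 -> 1100
```

Tiny as it is, this captures the remarkable point of Turing's 1936 paper: a table of rules plus an unbounded tape is enough to express any computation.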

More in the news.
- LRK -

Alan Turing: Inquest's suicide verdict 'not supportable'

By Roland Pease, BBC Radio Science Unit, 23 June 2012, last updated at 03:52 ET

Alan Turing, the British mathematical genius and codebreaker born 100 years ago on 23 June, may not have committed suicide, as is widely believed.
At a conference in Oxford on Saturday, Turing expert Prof Jack Copeland will question the evidence that was presented at the 1954 inquest.
He believes the evidence would not today be accepted as sufficient to establish a suicide verdict.
Indeed, he argues, Turing's death may equally probably have been an accident.

snip
---------------------
Where will computers take us in the future?  Will they become more like the human computers of old and have original thoughts?
- LRK -

---------------------

Robotics: Anticipating Asimov

by PAUL GILSTER on JUNE 21, 2012
My friend David Warlick and I were having a conversation yesterday about what educators should be doing to anticipate the technological changes ahead. Dave is a specialist in using technology in the classroom and lectures all over the world on the subject. I found myself saying that as we moved into a time of increasingly intelligent robotics, we should be emphasizing many of the same things we’d like our children to know as they raise their own families. Because a strong background in ethics, philosophy and moral responsibility is something they will have to bring to their children, and these are the same values we’ll want to instill into artificial intelligence.
The conversation invariably summoned up Asimov’s Three Laws of Robotics, first discussed in a 1942 science fiction story (‘Runaround,’ in Astounding Science Fiction’s March issue) and later the basic principles of all his stories about robots. In case you’re having trouble remembering them, here are the Three Laws:
  • A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Asimov is given credit for these laws but was quick to acknowledge that it was through a conversation with science fiction editor John Campbell in 1940 that the ideas within them fully crystallized, so we can in some ways say that they were a joint creation. As Dave and I talked, I was also musing about the artificial intelligence aboard the Alpha Centauri probe in Greg Bear’s Queen of Angels (1990), which runs into existential issues that force it into an ingenious solution, one it could hardly have been programmed to anticipate.
snip

Here is to looking up, maybe near, maybe far, maybe even a star.
And the computer will take you there. And beware.
Morphogenesis (from the Greek morphê shape and genesis creation, literally, "beginning of the shape") is the biological process that causes an organism to develop its shape. It is one of three fundamental aspects of developmental biology along with the control of cell growth and cellular differentiation.
The process controls the organized spatial distribution of cells during the embryonic development of an organism. Morphogenesis can take place also in a mature organism, in cell culture or inside tumor cell masses. Morphogenesis also describes the development of unicellular life forms that do not have an embryonic stage in their life cycle, or describes the evolution of a body structure within a taxonomic group.
Morphogenetic responses may be induced in organisms by hormones, by environmental chemicals ranging from substances produced by other organisms to toxic chemicals or radionuclides released as pollutants, and other plants, or by mechanical stresses induced by spatial patterning of the cells.
snip
==============================================================
Artificial intelligence (AI) is the intelligence of machines and the branch of computer science that aims to create it. AI textbooks define the field as "the study and design of intelligent agents"[1] where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success.[2] John McCarthy, who coined the term in 1955,[3] defines it as "the science and engineering of making intelligent machines."[4]
AI research is highly technical and specialized, deeply divided into subfields that often fail to communicate with each other.[5] Some of the division is due to social and cultural factors: subfields have grown up around particular institutions and the work of individual researchers. AI research is also divided by several technical issues. There are subfields which are focussed on the solution of specific problems, on one of several possible approaches, on the use of widely differing tools and towards the accomplishment of particular applications. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception and the ability to move and manipulate objects.[6] General intelligence (or "strong AI") is still among the field's long term goals.[7] Currently popular approaches include statistical methods, computational intelligence and traditional symbolic AI. There are an enormous number of tools used in AI, including versions of search and mathematical optimization, logic, methods based on probability and economics, and many others.
The field was founded on the claim that a central property of humans, intelligence—the sapience of Homo sapiens—can be so precisely described that it can be simulated by a machine.[8] This raises philosophical issues about the nature of the mind and the ethics of creating artificial beings, issues which have been addressed by myth, fiction and philosophy since antiquity.[9] Artificial intelligence has been the subject of optimism,[10] but has also suffered setbacks[11] and, today, has become an essential part of the technology industry, providing the heavy lifting for many of the most difficult problems in computer science.[12]
snip
==============================================================
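The textbook definition quoted above — an intelligent agent "perceives its environment and takes actions that maximize its chances of success" — can be made concrete with a toy example. The classic two-square "vacuum world" below is a textbook-style illustration of my own, not from any of the quoted articles; all names in it are invented.

```python
# A simple reflex agent in the two-square vacuum world: the agent
# perceives (location, dirty?) and picks an action to clean both squares.

def vacuum_agent(percept):
    """Map a percept to an action: clean dirt, otherwise move to the other square."""
    location, dirty = percept
    if dirty:
        return "suck"
    return "right" if location == "A" else "left"

def run(world, steps=4):
    """Simulate the environment loop and return the actions the agent took."""
    location = "A"
    actions = []
    for _ in range(steps):
        action = vacuum_agent((location, world[location]))  # perceive, then act
        actions.append(action)
        if action == "suck":
            world[location] = False  # the square is now clean
        elif action == "right":
            location = "B"
        else:
            location = "A"
    return actions

print(run({"A": True, "B": True}))  # ['suck', 'right', 'suck', 'left']
```

Trivial, yes — but the perceive-decide-act loop is the skeleton that more sophisticated agents (planners, learners) elaborate on.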

WHAT THE MIND CAN CONCEIVE, AND BELIEVE, IT WILL ACHIEVE - LRK -

==============================================================

