TED Talks 2009: Pattie Maes & Pranav Mistry demo the "SixthSense" device
Pattie Maes and Pranav Mistry demo SixthSense
I've been intrigued by this question of whether we could evolve or develop a sixth sense -- a sense that would give us seamless, easy access to meta-information, information that may exist somewhere and may be relevant to help us make the right decision about whatever it is that we're coming across. And some of you may argue, well, don't today's cell phones do that already? But I would say no. When you meet someone here at TED -- and this is the top networking place, of course, of the year -- you don't shake somebody's hand and then say, "Can you hold on for a moment while I take out my phone and Google you?" Or when you go to the supermarket and you're standing there in that huge aisle of different types of toilet paper, you don't take out your cell phone, open a browser, and go to a website to try to decide which of these different types of toilet paper is the most ecologically responsible purchase to make.
So we don't really have easy access to all this relevant information that can just help us make optimal decisions about what to do next and what actions to take. And so, my research group at the Media Lab has been developing a series of inventions to give us access to this information in an easy way, without requiring that the user change any of their behavior. And I'm here to unveil our latest effort, and our most successful effort so far, which is still very much a work in progress. I'm actually wearing the device right now, and we've sort of cobbled it together with components that are off the shelf -- and, by the way, it only costs 350 dollars at this point in time.
I'm wearing a camera, just a simple webcam, and a portable, battery-powered projection system with a little mirror. These components communicate with the cell phone in my pocket, which acts as the communication and computation device. And in the video here we see my student Pranav Mistry, who's really the genius who's been implementing and designing this whole system. And we see how this system lets him walk up to any surface and start using his hands to interact with the information that is projected in front of him. The system tracks the four significant fingers. In this case, he's wearing simple marker caps that you may recognize. But if you want a more stylish version, you could also paint your nails in different colors.
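The marker-cap tracking described here can be sketched as simple color segmentation: threshold each frame by the known marker colors and take the centroid of each matching blob as a fingertip position. A minimal sketch, assuming a toy "frame" represented as a grid of color-name strings rather than real camera pixels (the frame format, color names, and function name are illustrative, not the actual SixthSense implementation):

```python
# Sketch of color-marker fingertip tracking: each fingertip wears a
# distinctly colored cap, so the camera only needs to find the pixels
# of each marker color and take their centroid.

def track_markers(frame, marker_colors):
    """frame: 2D grid of color-name strings (a stand-in for an RGB image).
    Returns {color: (row, col)} centroid for each marker color found."""
    positions = {}
    for color in marker_colors:
        pixels = [(r, c)
                  for r, row in enumerate(frame)
                  for c, value in enumerate(row)
                  if value == color]
        if pixels:  # centroid of all pixels matching this marker color
            rows, cols = zip(*pixels)
            positions[color] = (sum(rows) / len(rows), sum(cols) / len(cols))
    return positions

# Tiny synthetic frame: background '.' with two marker blobs.
frame = [
    ['.', 'red', 'red', '.', '.'],
    ['.', 'red', 'red', '.', '.'],
    ['.', '.', '.', 'blue', '.'],
]
print(track_markers(frame, ['red', 'blue', 'green']))
# → {'red': (0.5, 1.5), 'blue': (2.0, 3.0)}
```

With a real webcam this thresholding would run per frame in a color space such as HSV, but the blob-then-centroid idea is the same.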
And the camera basically tracks these four fingers and recognizes any gestures that he's making, so he can just go to, for example, a map of Long Beach, zoom in and out, et cetera. The system also recognizes iconic gestures, such as the "take a picture" gesture, and then it takes a picture of whatever is in front of you. And when he then walks back to the Media Lab, he can just go up to any wall and project all the pictures that he's taken, sort through them and organize them, and re-size them, et cetera, again using all natural gestures. So, some of you most likely were here two years ago and saw the demo by Jeff Han, or some of you may think, "Well, doesn't this look like the Microsoft Surface table?" And yes, you also interact using natural gestures, both hands, et cetera. But the difference here is that you can use any surface, you can walk up to any surface, including your hand, if nothing else is available, and interact with this projected data. The device is completely portable, and can be ... (Applause)
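The zoom-in and zoom-out gesture just mentioned can be read off just two of those tracked fingertips: if the distance between thumb and index grows between frames, zoom in; if it shrinks, zoom out. A minimal sketch, where the threshold value, function name, and coordinates are all illustrative assumptions rather than details from the actual system:

```python
import math

def pinch_gesture(prev, curr, threshold=5.0):
    """prev/curr: ((x1, y1), (x2, y2)) fingertip pairs from two frames.
    Returns 'zoom_in', 'zoom_out', or None if the change is too small."""
    def spread(pair):
        # Euclidean distance between the two fingertips.
        (x1, y1), (x2, y2) = pair
        return math.hypot(x2 - x1, y2 - y1)

    delta = spread(curr) - spread(prev)
    if delta > threshold:
        return 'zoom_in'    # fingers moved apart
    if delta < -threshold:
        return 'zoom_out'   # fingers moved together
    return None             # below the noise threshold

print(pinch_gesture(((0, 0), (10, 0)), ((0, 0), (30, 0))))  # → zoom_in
```

The noise threshold keeps small tracking jitter from being misread as a deliberate gesture.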
So one important difference is that it's totally mobile. Another, even more important difference is that in mass production this would not cost more tomorrow than today's cell phones, and it would actually not come in much bigger packaging -- it could look a lot more stylish than this version that I'm wearing around my neck. But other than letting some of you live out your fantasy of looking as cool as Tom Cruise in "Minority Report," the reason why we're really excited about this device is that it really can act as one of these sixth-sense devices that gives you relevant information about whatever is in front of you. So we see Pranav here going into the supermarket, and he's shopping for some paper towels. And, as he picks up a product, the system can recognize the product that he's picking up, using either image recognition or marker technology, and give him a green light or an orange light. He can ask for additional information. So this particular choice here is a particularly good choice, given his personal criteria. Some of you may want the toilet paper with the most bleach in it rather than the most ecologically responsible choice.
(Laughter)
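Once image recognition or a marker tag has identified the product, the green/orange light described above reduces to looking the product up against the shopper's own criterion. A minimal sketch, assuming a made-up product database, eco scores, and threshold (none of these names or numbers come from the talk):

```python
# Illustrative product metadata: product id -> eco-friendliness score (0-100).
PRODUCTS = {
    "towels-recycled": 85,
    "towels-bleached": 20,
}

def shelf_light(product_id, eco_threshold=50):
    """Green light if the recognized product meets the shopper's personal
    eco criterion, orange otherwise -- the behavior the talk describes."""
    score = PRODUCTS.get(product_id)
    if score is None:
        return "unknown"   # product not recognized / not in the database
    return "green" if score >= eco_threshold else "orange"

print(shelf_light("towels-recycled"))  # → green
print(shelf_light("towels-bleached"))  # → orange
```

Because the threshold is a parameter, the same lookup serves the shopper who wants the most bleach just as well as the one who wants the greenest option.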
If he picks up a book in the bookstore, he can get an Amazon rating. It gets projected right on the cover of the book. This is the book by Juan, our previous speaker, which gets a great rating, by the way, on Amazon. And so, Pranav turns the page of the book and can then see additional information about the book -- reader comments, maybe some information by his favorite critic, et cetera. If he turns to a particular page, he finds an annotation by maybe an expert or a friend of ours that gives him a little bit of additional information about whatever is on that particular page. Reading the newspaper -- it never has to be outdated.
(Laughter)
You can get video annotations of the event that you're reading about. You can get the latest sports scores, et cetera. This is a more controversial one.
(Laughter)
As you interact with someone at TED, maybe you can see a word cloud of the tags, the words that are associated with that person in their blog and personal webpages. In this case, the student is interested in cameras, et cetera. On your way to the airport, if you pick up your boarding pass, it can tell you that your flight is delayed, that the gate has changed, et cetera. And, if you need to know what the current time is it's as simple as drawing a watch -- (Laughter) (Applause) on your arm.
So that's where we're at so far in developing this sixth sense that would give us seamless access to all this relevant information about the things that we may come across. And this is my student Pranav, who's really, like I said, the genius behind this.
(Applause)
He does deserve a lot of applause, because I don't think he's slept much in the last three months, actually. And his girlfriend is probably not very happy with him either. But it's not perfect yet; it's very much a work in progress. And who knows, maybe in another 10 years we'll be here with the ultimate sixth-sense brain implant. Thank you.
(Applause)
http://www2.myoops.org/main.php?act=course&id=2114
Talk given February 2009; posted on TED in March 2009
TED Traditional Chinese translation: Daniel Chou
TED Traditional Chinese editing: Celia Yeung
TED original translation page
Editing: 洪曉慧
Post-production: 洪曉慧
Video subtitle post-production: 謝旻均