Monday, April 30, 2012

Getting a textured 3D scan from just a webcam


Here’s an oldie but a goodie that passed us by the first time it went around the Internet. [Qi Pan], a (former) PhD student at Cambridge, made a 3D modeling program using only a simple webcam. Not only does it make very fast work of building 3D models, it also renders the real texture onto the virtual object.
The project is called ProFORMA, and to get some idea of exactly how fast it is, the model of a church seen above was captured and rendered in a little over a minute. To get the incredible speed of ProFORMA, [Qi] had his webcam take a series of keyframes. When the model is rotated about 10°, another keyframe is taken and the corners are triangulated with some very fancy math.
Even though [Qi]’s project is from 2009, it seems like it would be better than ReconstructMe, the Kinect-based 3D scanning we saw a while ago. There’s a great video of [Qi] modeling a papercraft church after the break, but check out the actual paper for a better idea of how ProFORMA works.
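
To make the keyframe idea concrete, here is a minimal TypeScript sketch of that selection loop. It is only an illustration, not ProFORMA's actual code: the pose tracker is reduced to a hypothetical estimateRotationDegrees() callback supplied by the caller, and the triangulation step is left out entirely.

    // Hypothetical keyframe selector in the spirit of ProFORMA's capture loop.
    // estimateRotationDegrees() stands in for a real camera pose tracker.
    interface Frame { pixels: Uint8ClampedArray; width: number; height: number; }

    function selectKeyframes(
      frames: Frame[],
      estimateRotationDegrees: (a: Frame, b: Frame) => number,
      thresholdDeg: number = 10
    ): Frame[] {
      const keyframes: Frame[] = [];
      let last: Frame | null = null;
      for (const frame of frames) {
        // Keep the first frame, then every frame that is roughly 10 degrees
        // of rotation away from the previous keyframe.
        if (last === null || estimateRotationDegrees(last, frame) >= thresholdDeg) {
          keyframes.push(frame);
          last = frame;
        }
      }
      return keyframes; // these frames feed the feature triangulation step
    }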


Fwd: Press Release - Antik brings cost-effective device to transcode 120 streams at once

---------- Forwarded message ----------
From: "Igor Kolla, ANTIK Technology" <sobekova@dlm.sk>
Date: Apr 30, 2012 12:21 AM
Subject: Press Release - Antik brings cost-effective device to transcode 120 streams at once
To: <john.sokol@gmail.com>

Dear Sir/Madam

Please find attached a press release from ANTIK Technology:


ANTIK BRINGS COST-EFFECTIVE DEVICE TO TRANSCODE 120 STREAMS AT ONCE

Introduction:
-----------
ANTIK Technology, a Slovak provider of IPTV solutions, is launching new hardware, the Juice Ember Hi-Density Multi Transcoder, based on an ASIC able to transcode up to 15 SD or HD channels per module. With 8 transcoding modules, customers can manage 120 channels. This professional multi-stream H.264 HD encoder and transcoder is an optimal solution for companies providing OTT content delivery.


You will find the whole press release attached.

Best regards,

Igor Kolla
ANTIK Technology
Carskeho 10
040 01 Kosice
Slovakia
tel.: +421 948 228 122
e-mail: pr@antik.sk
www.antiktech.com www.juiceiptv.com

Sunday, April 29, 2012

Fwd: [svlug-announce] SVLUG May 2nd Meeting: Panel of Speakers on Comparative Operating Systems


---------- Forwarded message ----------
From: "mark weisler"
Date: Apr 29, 2012 9:47 AM
Subject: [svlug-announce] SVLUG May 2nd Meeting: Panel of Speakers on Comparative Operating Systems
To: <svlug-announce@lists.svlug.org>
WHEN:
  Wednesday, May 2nd, 2012
  7:00pm-9:00pm
MAIN PRESENTATION
 TOPIC: Comparing Current Open Source Unix Systems
  Snacks and beverages at this meeting will be provided by ValueHost.
 PRESENTED BY:
   A Distinguished Panel of  Speakers
 TOPIC SUMMARY:
   The panel will discuss the main current open source Unix systems including Linux, BSD, illumos, SmartOS, Plan 9,
   and possibly others. Attendees will learn more about the energy and ideas behind these operating systems
   as well as their future direction.
 ABOUT THE PRESENTERS: 
   Our panel includes Kevin Dankwardt of K Computing representing Linux; Josh Paetzel, Director of IT at iXsystems, Inc.
   for FreeBSD; Bryan Cantrill will be talking about the successor to OpenSolaris, illumos, and Joyent's own distro, SmartOS;
   Charles Forsyth, Technical Director and Co-Founder at Vita Nuova for Plan 9; and possibly others.
   Check out our flyer (340 KB PDF).
   Rick Moen, long-time volunteer to SVLUG and veteran system administrator, will be our moderator.
 LOCATION:
   Symantec
   VCAFE Facility
   350 Ellis Street (near E. Middlefield Road)
   Mountain View, CA 94043
   Directions on how to get there are listed at:
   http://www.svlug.org/directions/veritas.php
   We've tried our very best for these directions to be accurate.
   If you have any improvements to make, please let SVLUG's volunteers know!
   webmaster at svlug.org
 POST-MEETING GATHERING:
  If you just can't get enough, a smaller group usually goes to a local
  restaurant/diner after the meeting:  Frankie, Johnnie & Luigi, Too,
  939 West El Camino Real between Shoreline and Castro, Mountain View.
We look forward to seeing you there!
_______________________________________________
svlug-announce mailing list
svlug-announce@lists.svlug.org
http://lists.svlug.org/lists/listinfo/svlug-announce

Fwd: MultiTouch Introduces Ideal Interactive Displays for Museums


---------- Forwarded message ----------
From: "Chris Pfaff"
Date: Apr 29, 2012 5:06 PM
Subject: MultiTouch Introduces Ideal Interactive Displays for Museums
MultiTouch, Ltd., the world's leading producer of multi-user interactive LCD displays, today introduced new interactive multitouch display features for museum exhibitions at MuseumExpo™, the annual meeting and exhibition for the American Association of Museums (AAM). MultiTouch's new features for its MultiTaction® Cells include new augmented reality applications, a large-scale MultiTaction Wall that supports an unlimited number of concurrent users, and a Cornerstone Software Development Kit (SDK) that addresses the specific needs of museums and permanent exhibitions.
The full press release appears below.
If you would like to speak with a MultiTouch executive, please contact me on 201-218-0262 or chris@chrispfafftechmedia.com.
- Chris

MULTITOUCH INTRODUCES IDEAL INTERACTIVE DISPLAYS FOR MUSEUMS
MultiTouch Introduces Interactive Features For Museums and Permanent Exhibitions,
Including MultiTaction® Wall and MultiTouch Augmented Reality Table
MUSEUMEXPO™, BOOTH #1702
FOR RELEASE ON: MONDAY, APRIL 30, 2012
MINNEAPOLIS, MN – MultiTouch, Ltd., the world's leading producer of multi-user interactive LCD displays, today introduced new interactive multitouch display features for museum exhibitions at MuseumExpo™, the annual meeting and exhibition for the American Association of Museums (AAM). MultiTouch's new features for its MultiTaction® Cells include new augmented reality applications, a large-scale MultiTaction Wall that supports an unlimited number of concurrent users, and a Cornerstone Software Development Kit (SDK) that addresses the specific needs of museums and permanent exhibitions.
MultiTouch will also showcase applications from some of its leading global museum installations at MuseumExpo, including:
'Avatar' exhibition at the Science Fiction Hall of Fame in Seattle, Washington
Augmented reality application where users can find movie-themed content by placing coasters on the displays
La Biennale di Venezia in Venice, Italy
Interactive art installation where users can manipulate the original digital piece of art which returns to its initial state after a timeout
Johnson Space Center in Houston, Texas
Application to commemorate the first space shuttle flight with built-in photos and video footage
Mob Museum in Las Vegas, Nevada
Augmented reality application where mobster-printed coasters show their position and connections within their crime 'families'
National Museum of Australia in Canberra, Australia
'Never Enough Grass' is an animated interactive exhibit that enables visitors to navigate landmark locations that played a key role in the development and expansion of the Australian pastoral industry. 'Yiwarra Kuju' is an interactive art installation that traces a famous Aboriginal cattle route, the Canning Stock Route, in Western Australia
University of Oregon Alumni Center (alumni search application) in Eugene, Oregon
Interactive access to the alumni database of the UO
Wells Fargo History Museums in various locations across the United States and India
Gold Panning game featuring digitized images of real-world gold nuggets
World's Fair Shanghai
A massive wall (Muro de Chile / Wall of Chile) presents a map of Chile that visitors can use to interact with the sights and sounds of the country
The MultiTouch product line showcase at the Museum Expo includes an 8-foot-wide by 4-foot-tall interactive wall and a table-integrated MultiTaction 55" display, both highlighting the industry-leading multi-user displays targeted especially at the museum segment.
"The museum experience is uniquely suited to multitouch applications, which engage visitors in an interactive dialogue with the themes and stories featured," said Timo Korpela, general manager of MultiTouch Americas. "Museums present new information and perspectives on life, and we are able to create new worlds with multitouch applications that make exhibits come alive in new ways, especially through augmented reality and interaction to real life objects."
About MultiTouch Ltd.
MultiTouch is a leading developer of interactive display systems, based on patented software and hardware designs. The company is headquartered in Helsinki, Finland, with U.S. offices in Santa Clara, California and New York City. Its systems are currently in use in more than 40 countries around the globe. For more information, please visit www.multitouch.fi.
# # #
MultiTaction® is a registered trademark of MultiTouch Ltd.

Fwd: The shape of things, illuminated: Metamaterials, surface topology and light-matter interactions


---------- Forwarded message ----------
From: "Jonathan Post"
The shape of things, illuminated: Metamaterials, surface topology and
light-matter interactions
April 28, 2012 by Stuart Mason Dambrot
http://phys.org/news/2012-04-illuminated-metamaterials-surface-topology-light-matter.html
(Phys.org) -- Finding new connections between different disciplines
leads to new – and sometimes useful – ideas. That's exactly what
happened when scientists in the Department of Physics, Queens College,
City University of New York (CUNY), in collaboration with City College
of CUNY, Purdue University and University of Alberta, leveraged
mathematical topology to create an artificially nanostructured
anisotropic (exhibiting properties with different values when measured
along axes in different directions) metamaterial that can be switched
from a non-conductive dielectric state to a medium that behaves like
metal in one direction and like a dielectric in another. The
metamaterial's optical properties were mapped onto a topological
transformation of an ellipsoidal surface into a hyperboloid – and
transitioning from one to the other dramatically increases the photon
density, resulting in a dramatic increase in the light intensity inside
the material. The researchers state that by allowing
topologically-based manipulation of light-matter interactions, these
types of metamaterials could lead to a wide range of photonic
applications in solar cells, light emitting diodes, displays, and
quantum computing and communications.
Associate Professor Vinod M. Menon recalls that the project started
out with theoretical prediction and computational modeling. "Our
subsequent experimental work was based on computational modeling of
the structures and the anticipated effects," he relates to Phys.org.
"At that point, the main challenge in describing the
ellipsoid-to-hyperboloid transition was the design of the structure
that will show the transition in the relevant spectral range.
Relatedly," Menon continues, "showing that this topological transition
manifests itself in increased rates of spontaneous emission of
emitters positioned near the metamaterial required the identification
of a suitable light emitting material. In our case, that material was
quantum dots." This critical choice of emissive material allowed the
researchers to study the enhancement in spontaneous emission in both
the elliptical and hyperbolic ranges.
Artist's interpretation of the optical topological transition
occurring in metamaterials. Here the transformation from an ellipsoid
to a hyperboloid (left) is associated with a huge increase in the
light intensity inside the metamaterial (right). Courtesy Vinod Menon
| Animation created by Vladimir Shuvayev / Queens College - CUNY.
Menon is equally to-the-point when describing the key insights,
innovations and techniques the team used to address the above
challenges. "In addition to the right photon emission source and a
suitable material system for metamaterial fabrication, it was
necessary to come up with an appropriate control sample to isolate the
effect that we were looking for."
Menon adds that the team's next steps are to reduce optical losses,
improve the quality of silver films, and look into new material
systems that will show similar effects. "Silver is the metallic
component in the metamaterial that allows us to realize the
anisotropy. Theoretically one could use any metal or even doped oxides
and semiconductors. In our case silver was used because of the lower
optical losses in the visible wavelength range, but the roughness of
silver layers used in the present structure is an issue. This will
have to be addressed in the next round of experiments," Menon
cautions. "Additionally, the optical losses in the material need to be
alleviated. Finally, approaches to enhance the transmission properties
need to be addressed for light emitting applications."
According to Menon, the team's findings impact the development of new
routes to manipulating light-matter interactions through using
metamaterials and controlling the topology of the iso-frequency
surface – that is, one having a constant frequency. "The structure
that we demonstrated shows a large increase in the light intensity
over a wide spectral range," he explains. "Such structures can help in
enhanced light harvesting, which could result in more efficient solar
cells. One could also envision using these to develop single photon
sources necessary for quantum communication protocols and quantum
computers. Finally, through engineering of transmission properties of
these systems, and by combining them with light emitters, one may also
realize super bright LEDs that would be useful for display
applications."
Venturing further afield, Menon describes more exotic possibilities.
"Ideas of light manipulation used here could be extended for control
of thermal properties as well. More esoteric are the proposed ideas of
realizing a table top optical black hole and manipulation of
space-time curvature – and in fact, these proposals have been recently
made [1,2] by one of my co-authors, Evgenii Narimanov, and his
collaborators."
More information: Topological Transitions in Metamaterials, Science 13
April 2012: Vol. 336 no. 6078 pp. 205-209, doi:
10.1126/science.1219171
Related:
[1] Optical black hole: Broadband omnidirectional light absorber, Applied
Physics Letters 95, 041106 (2009), doi: 10.1063/1.3184594
[2] Metric Signature Transitions in Optical Metamaterials, Physical
Review Letters 105, 067402 (2010), doi: 10.1103/PhysRevLett.105.067402

FCC To Require TV Stations To Post Rates For Campaign Ads - Slashdot

http://politics.slashdot.org/story/12/04/29/0248207/fcc-to-require-tv-stations-to-post-rates-for-campaign-ads

Optus Loses Second Battle In Aussie TV-Timeshifting Battle - Slashdot

http://yro.slashdot.org/story/12/04/28/1659250/optus-loses-second-battle-in-aussie-tv-timeshifting-battle

"After winning an initial legal battle to continue its mobile 'TV Now' terrestrial-television re-broadcasting service, Optus has lost a second battle in Australian Federal court. The Optus system 'time-shifted' broadcast signals by two minutes, and then streamed them to customers' mobile phones. In the previous ruling, the judge sided with Optus' argument that since the customer requested the service, they were the ones recording the signal, and thus it was fair-use under Australian copyright law. However, the new ruling declared Optus to be the true entity recording and re-distributing the broadcasts, and thus in violation of the law. There has been no word yet on whether Optus will appeal the decision, but as they could be retroactively liable for a great deal of damages, it is almost certain that they will."

Friday, April 27, 2012

Gigantic liquid crystal display is like a giant calculator

Some say he turns on his soldering iron by saying, “Flame on!” He deadbug-solders QFP packages. All we know is he’s called [stig] and he sent in an awesome video of a new display at the Nature Research Center in Raleigh, North Carolina. It’s a 10 foot by 90 foot LCD display that uses 6 inch square glass panels containing the same liquid crystals you’d find in a calculator.
The display/installation is called Patterned by Nature and is built using 3600 pieces of LCD privacy glass. When a voltage is applied to the glass it changes from clear to opaque. While this technology has been around for decades (just look at your calculator), only in the last few years has LCD privacy glass come down in price to make a project like this economical.
The gigantic display was created in part by Sosolimited, an art studio that has made a similar project before. The display hanging in the atrium of the Raleigh Nature Research Center is amazingly efficient for its size, drawing only 75 watts.
If you’d like to try your hand at a similar build, we wish you luck; this LCD glass is still somewhat expensive but perhaps in a few years the price will come down enough that we can play Tetris on the side of a building.



Fwd: Is glasses-free 3D the next big tablet feature?

---------- Forwarded message ----------
From: "MasterImage 3D" <display@masterimage3d.com>
Date: Apr 27, 2012 10:40 AM
Subject: Is glasses-free 3D the next big tablet feature?
To: <sokol@videotechnology.com>



Is 3D the next big tablet feature?

From James Cameron talking up 3D tablets, to a close-up look at our 10.1" display in the Los Angeles Times, there's been a lot of buzz about autostereoscopic tablets in recent weeks. 

Here are just a few articles we thought would interest you:






Thursday, April 26, 2012

Pirate Eye | The Leader in Anti-Piracy Technology

Their system is designed to detect cameras in dark theaters and alert an operations center, which verifies the detection and then contacts the authorities to go about making an arrest.

http://www.pirateeye.com/


Anti-paparazzi, anti-photo technology.

Open Source TV

This is very cool.
http://www.theopensource.tv/

Here is a cute little movie: TV Mind Control.
http://archive.org/details/tvmindcontrol

YouTube 3D Video



See this link:
YouTube 3D Video  



  •  Upload 3D Content
  • Convert 2D video to 3D 


  • Watch 3D Content



    Set up HTML5 Stereo View

    HTML5 stereo view allows you to view 3D video on YouTube using specialized hardware and software. We currently support the following solutions for viewing YouTube 3D video through HTML5 stereo view:
    NVIDIA 3D Vision: Configure 3D Vision for watching YouTube 3D (provided by NVIDIA)

    For help setting up your hardware or software for HTML5 stereo view, select a link from the above list. Please note that the above links are not controlled by YouTube, nor is their content endorsed by YouTube.

    Why is it called HTML5 stereo view?

    HTML5 stereo view is a standards compliant way of watching 3D content using special hardware and software:
    • HTML5 is the technology that we use to display video
    • Stereo view stands for stereoscopic view, meaning the ability to show you a different image for each eye

    HTML5 stereo view adheres to web standards so that new devices can easily support watching 3D on YouTube. YouTube aims to expand support for additional standards compliant devices in the future.


    Monday, April 23, 2012

    Microsoft Media Platform will support MPEG-DASH

    ---------- Forwarded message ----------
    From: "3D CineCast" <olivier.amato@itbroadcastanddigitalcinema.com>
    Date: Apr 23, 2012 4:06 AM
    Subject: 3D CineCast
    To: <john.sokol@gmail.com>


    3D CineCast



    Posted: 22 Apr 2012 01:06 PM PDT
    Microsoft Media Platform will support MPEG-DASH, a recently ratified ISO/IEC standard for dynamic adaptive streaming over HTTP. Microsoft plans to support DASH and other open standards as part of an industry-wide initiative to establish reliable video delivery to Internet connected devices and enable true interoperability between adaptive streaming technologies from different vendors.

    Much like Smooth Streaming, DASH uses Extensible Markup Language (XML) to describe media presentations in a manifest file which references media streams stored in ISO Base Media File Format. Combined with the standard HTTP protocol and existing Web content delivery networks, the DASH standard enables a better video experience for end users by automatically adapting to varying client and network conditions during playback.
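
    Since the manifest is plain XML, a client can inspect it with nothing more than a standard XML parser. The TypeScript sketch below is only an illustration of that idea (the manifest URL is hypothetical, and real players do far more): it fetches an MPD and lists the bandwidth advertised by each Representation so a player could pick a starting bitrate.

        // Minimal sketch: read Representation bandwidths out of a DASH MPD.
        // The manifest URL is a placeholder; error handling is omitted.
        async function listRepresentationBandwidths(manifestUrl: string): Promise<number[]> {
          const response = await fetch(manifestUrl);
          const xmlText = await response.text();
          const mpd = new DOMParser().parseFromString(xmlText, "application/xml");
          const bandwidths: number[] = [];
          // Each <Representation> element advertises one encoded bitrate.
          for (const rep of Array.from(mpd.getElementsByTagName("Representation"))) {
            const bw = rep.getAttribute("bandwidth");
            if (bw !== null) bandwidths.push(Number(bw));
          }
          return bandwidths.sort((a, b) => a - b);
        }

        // Example (hypothetical URL):
        // listRepresentationBandwidths("https://example.com/stream.mpd")
        //   .then(list => console.log("available bitrates:", list));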

    Taking advantage of similarities between Smooth Streaming and DASH, Windows Azure Media Services will add support for DASH Live Profile later this year so that both Smooth Streaming and DASH devices can access the same live and on-demand video presentations using either manifest format. This will enable a smooth transition to DASH for millions of devices and services currently using Smooth Streaming.

    In addition to server-side support, Microsoft will also add support for DASH to all its Smooth Streaming client development kits. The first step will be to enable DASH support in the Smooth Streaming Client for Silverlight, followed by support in Smooth Streaming Client SDKs for Windows 8, iOS, Xbox, Windows Phone and Smooth Streaming Client Porting Kit for embedded devices.

    Microsoft is also contributing to W3C efforts to standardize adaptive streaming APIs in HTML5 so that DASH Web applications may also be written in HTML5 and ECMAScript (JavaScript) in the future without requiring browser plug-ins such as Silverlight and Flash to enable advanced streaming media scenarios.

    Microsoft has contributed to the development of the DECE UltraViolet video format which enables download and adaptive streaming of premium movie and TV content; and to various international broadcast standards and consortia so that a common protected video format based on DECE Common File Format, MPEG Common Encryption, and MPEG-DASH specifications will be supported by all adaptive streaming services and devices to enable reliable interoperability for consumers, just like broadcast TV and DVD.

    What Microsoft services will have DASH support?
    Windows Azure Media Services will provide encoding, encryption, and streaming support for Application Profiles based on DASH "ISO Base Media Live Profile" this year. Both DASH manifests and Smooth Streaming manifests will be generated to allow the same media to be streamed by DASH clients and Smooth Streaming clients. The primary media format will conform to the PIFF 1.3 specification in addition to Live Profile, will include several features and constraints compatible with the DECE Common File Format, and may optionally include MPEG Common Encryption with PlayReady DRM support. Windows Azure Media Services will also be capable of live transformation to multiple streaming formats, including MPEG-2 Transport Streams for use with DASH M2TS Simple Profile manifests or M3U8 playlists.

    What Microsoft client technologies will have DASH support?
    Microsoft plans to add MPEG-DASH support to all client development kits that currently support Smooth Streaming. These are: Smooth Streaming Client for Silverlight; Smooth Streaming Client for Windows Phone; Smooth Streaming Client SDK for Windows 8 Metro-style applications; Xbox LIVE Application Development Kit; Smooth Streaming SDK for iOS Devices with PlayReady; and Smooth Streaming Client Porting Kit.

    Is Microsoft discontinuing Smooth Streaming?
    No. Microsoft will continue to invest in Smooth Streaming as an established technology and brand while ensuring its Smooth Streaming services, clients, tools and workflows are DASH compatible. The Smooth Streaming file format (PIFF 1.3) is already compatible with the DASH specification (ISO Base Media Live Profile) so customers and partners who are investing into creation of Smooth Streaming content today will have a clear path to making that content deliverable to DASH clients in the future.

    What is Common Encryption?
    Common Encryption is an MPEG standard using AES-128 media encryption that enables a single protected ISO Base Media file or adaptive streaming presentation to be used with any DRM system supported by a device and the publisher. The standard is designated ISO/IEC 23001-7 "Information technology – MPEG systems technologies – Part 7: Common encryption in ISO base media file format files". Prior to this standard, a different set of files was required for each different DRM type, and interchange of files between authorized devices was generally not possible because of different DRMs.

    What is Common File Format?
    Common File Format (CFF) is a DECE video specification titled "Common File Format & Media Formats Specification" used for content download. It specifies video files based on fragmented ISO Base Media files (MPEG-4 Part 12), optionally using Common Encryption, containing AVC video, AAC audio, SMPTE Timed Text and Graphics subtitles, metadata, and several optional audio formats. All parameters required for interoperability are sufficiently specified to allow independently implemented encoders, publishers, delivery services, and devices to reliably interchange and play the same files. Different "media profiles" are specified for high definition, standard definition, and "portable" definition devices.

    The CFF requirement to use short movie fragments makes these files and compatible decoders forward compatible with DASH adaptive streaming using movie fragments as DASH Media Segments. DECE is currently in the process of specifying "Common Streaming Format" and considering DASH Application Profiles.

    What about HTML5 playback?
    The current working draft of HTML5 does not include specific support for either adaptive streaming or DRM protection. It is possible to indicate a playlist or manifest file as the source of the <video> tag, but a publisher would have no control over the behavior and presentation that each device or browser would execute in response to that manifest. There are no standard APIs to integrate the presentation decisions made by the platform with a presentation application running in the browser.

    However, there is work underway in W3C to add both adaptive streaming and content protection APIs so that a script application will be able to run in any HTML5 browser to perform DRM license acquisition and DASH adaptive streaming under the control of the script application. This will allow the script application to control adaptive heuristics, authorization, load balancing, performance reporting, targeted ad insertion, and interactive presentation and navigation of one or more adaptive presentations. These script APIs will make HTML5/JavaScript DASH applications a viable alternative to Silverlight and Flash across the full range of devices … in the future.
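
    That W3C work eventually became the Media Source Extensions API. As a rough illustration of the kind of script control described above (written against today's MSE API, not anything that shipped in 2012, and with placeholder segment URLs and codec string), a page can feed fetched media segments into a <video> element itself:

        // Sketch of script-driven adaptive streaming with Media Source Extensions.
        // Segment URLs and the codec string are illustrative placeholders.
        function playSegments(video: HTMLVideoElement, segmentUrls: string[]): void {
          const mediaSource = new MediaSource();
          video.src = URL.createObjectURL(mediaSource);

          mediaSource.addEventListener("sourceopen", async () => {
            const buffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
            for (const url of segmentUrls) {
              const segment = await (await fetch(url)).arrayBuffer();
              // Wait for the previous append to finish before queuing the next.
              await new Promise<void>(resolve => {
                buffer.addEventListener("updateend", () => resolve(), { once: true });
                buffer.appendBuffer(segment);
              });
            }
            mediaSource.endOfStream();
          });
        }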


    Sunday, April 15, 2012

    CAPTURING AUDIO & VIDEO IN HTML5

    Hot from the W3 standards committee
    HTML Media Capture (W3C Working Draft, 14 April 2011)


    HTML5 ROCKS site has a great article on this.

    "Audio/Video capture has been the "Holy Grail" of web development for a long time. For many years we've had to rely on browser plugins (Flash or Silverlight) to get the job done. Come on!

    HTML5 to the rescue. It might not be apparent, but the rise of HTML5 has brought a surge of access to device hardware. Geolocation (GPS), the Orientation API (accelerometer), WebGL (GPU), and the Web Audio API (audio hardware) are perfect examples. These features are ridiculously powerful, exposing high level JavaScript APIs that sit on top of the system's underlying hardware capabilities.

    This tutorial introduces a new API, navigator.getUserMedia(), which allows web apps to access a user's camera and microphone."


    I've done a little exploring and getUserMedia isn't quite ready for prime time. So far, for PCs, it only works in Chrome and Opera. In my first-hand experience (I only tested on Chrome), it grabbed the wrong camera device and there was no way to select which camera to capture from.
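
    For what it's worth, the modern promise-based version of the API does let you pick a device (the 2012 draft used callbacks and vendor prefixes, and had no device enumeration). A rough TypeScript sketch, not tied to any particular browser build:

        // Sketch using today's navigator.mediaDevices API: list attached
        // cameras, then open a specific one by deviceId. Note that device
        // labels may be empty until the user has granted camera permission.
        async function openCamera(video: HTMLVideoElement, preferredLabel?: string): Promise<void> {
          const devices = await navigator.mediaDevices.enumerateDevices();
          const cameras = devices.filter(d => d.kind === "videoinput");
          const chosen = cameras.find(c => preferredLabel !== undefined &&
                                           c.label.includes(preferredLabel)) ?? cameras[0];

          const stream = await navigator.mediaDevices.getUserMedia({
            video: chosen ? { deviceId: { exact: chosen.deviceId } } : true,
            audio: false,
          });
          video.srcObject = stream;   // attach the camera stream to a <video> element
          await video.play();
        }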

    It seems a lot of these new APIs are driven by phone developers who have found HTML/JavaScript an easy way to write universally supported applications. So there are a number of tools that basically take your web site and turn it into an Android or iPhone application.

    Support:
    • Android 3.0 browser - one of the first implementations. Check out this video to see it in action.
    • Chrome for Android (0.16)
    • Firefox Mobile 10.0


    From my notes:

    Web video input

    Pebble Watch for iPhone and Android, The Most Successful Kickstarter Project Ever - Forbes

    http://www.forbes.com/sites/anthonykosner/2012/04/15/pebble-watch-for-iphone-and-android-the-most-successful-kickstarter-project-ever/

    Wednesday, April 11, 2012

    Video in Flex / Flash

    I want to share this as a warning and a tip.

    I have been contacted by several different companies that have undertaken building products based on Flex and are now stuck because Flex isn't capable of extending very far beyond its demos. In each case they had outsourced the work to someone who, I am sure, put in the cheapest bid.

    Flex, for those who don't know about it, is one of several languages that can be used to make Flash applications (SWF).

    Apache Flex, formerly Adobe Flex, is a software development kit (SDK) for the development and deployment of cross-platform rich Internet applications based on the Adobe Flash platform. Initially developed by Macromedia and then acquired by Adobe Systems, Flex was donated to the Apache Software Foundation in 2011.  -  Wikipedia


    "I do know a bit of flex, I have it and play around with it when I can. It helps make android stuff work on other platforms and works with ellipse ...  got it by joining the adobe underemployed developers program ... so became an "official" adobe developer... ha ha  ... i find swishmax, easier ... leaving me more time to figure out what i want to do, flexbuilder ( now called flashbuilder ) does make it more cross platform compatible for stuff like making your toaster ask for more toast ( see red dwarf ) ... and doing it from your android ... " 
      - Will Crawford

    I think that says it all, "adobe underemployed developers program"

    Meaning there is now a small army of Flex developers who don't know video building web video websites and web applications. Because of this they are underbidding; they don't understand what they are getting into, and they don't have anyone to turn to when they get stuck.

    Flex makes it really easy to add video to a web site. It has some great demos with example code. Frankly, HTML5 and WebRTC should have been this easy, but they missed the mark. Flash is still one of the best options for grabbing video from a user's webcam without having to make them install a native binary. Flex makes this trivial.

    Displaying a webcam’s video in a Flex VideoDisplay control





    So if you just want to grab a video demo and build it into your site, changing the layout a little, great. This is a fantastic tool; it takes the complicated and makes it practical and easy.

    But be warned. This simplicity is deceptive and comes at a price. All the complexity is hidden. What this means is that when it comes time to do something more complex, you can't control its behavior. You're limited to what the libraries do and little else.

    Now add to this a crop of novice Flex developers with little to no knowledge of video and we have a brewing storm of failed development projects on the horizon, many of which may need to start over in some other tool.


    My advice: just don't try to push the limits of video technology and you'll be just fine.
    But if you expect something more advanced, you will soon learn the hard way that all your efforts may end up at a dead end.

    CEATEC JAPAN 2012. October

    http://www.ceatec.com/2012/en/application/index.html

    Fwd: Telestream Episode Streamlines Video Encoding for Cannes Film Festival

    ---------- Forwarded message ----------
    From: "Janet Swift" <vpomailer@mailx.virtualpressoffice.com>
    Date: Apr 11, 2012 7:05 AM
    Subject: Telestream Episode Streamlines Video Encoding for Cannes Film Festival
    To: <john.sokol@gmail.com>



    Dear John,

    Telestream Episode Streamlines Video Encoding for
    Cannes Film Festival


    Episode Engine automates encoding of all Web videos for Festival de Cannes

    Nevada City, Calif., April 11, 2012 – Telestream®, the leading provider of video transcoding and workflow solutions, today announced that its Episode® Engine application has been selected for the second year in a row to encode all Web videos for the Festival de Cannes. Episode was chosen for its ability to streamline the video encoding and delivery process, all within a single application. This year's Festival will be held May 16-27 in Cannes, France.

    Click here to read more ...




    Telestream NAB Booth SL1405

    Press contact:
    Janet Swift, Telestream
    janet_swift@telestream.net
    +1 530-470-1328

    For Telestream images, logos & corporate fact sheet:
    online press kit

    Visit Telestream website:
    www.telestream.net

    Telestream RSS Feeds

     

    Sunday, April 08, 2012

    Dan Catt alleging that the New Aesthetic isn’t about 8bit retro, the Robot Readable World, computer vision and pirates. | Beyond The Beyond | Wired.com

    http://m.wired.com/beyond_the_beyond/2012/04/dan-catt-alleging-that-the-new-aesthetic-isnt-about-8bit-retro-the-robot-readable-world-computer-vision-and-pirates/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+wired%2Findex+%28Wired%3A+Index+3+%28Top+Stories+2%29%29&utm_content=Google+Feedfetcher

    Volume Slicing Display

    This is fairly simple but hasn't been commercialized yet, AFAIK.

    It consists of one or two video cameras, a projector, a sheet of frosted glass or plastic, and a PC with a decent 3D graphics card.

    For software, they use ARToolKit to track the position of the transparent handheld screen, and DirectDraw or OpenGL to render the image that's projected.

    At this point, any standard Android tablet or iPad, just by itself, should be able to do this.

    Using the accelerometers and gyros alone is not enough. They need to be combined with additional data from the built-in video camera to track the tablet's movement relative to its environment.


    If someone wishes to pick up this challenge, let me know. I ask that you credit this post with the idea and link to it. I am also available on a paid basis.

    John L. Sokol




    The Volume Slicing Display enables interactive exploration of volumetric data using a rigid, passive, untethered screen. The system tracks the screen (a piece of plexiglas or paper) using a custom monocular high-speed vision system (Vision Chip) or using ARToolkit markers; then one or more projectors in the room project the corresponding slice of a 3D virtual object on that surface in real time. This experimental interface will enable multiple users to feel as if 3D virtual objects co-exist in real space, as well as to explore them interactively using cheap passive projection surfaces (plexiglas or even paper). This project is related to the "Khronos Projector" and the "Deformable Workspace", but is more oriented towards medical imaging technologies.

    http://www.k2.t.u-tokyo.ac.jp/perception/VolumeSlicingDisplay/index-e.html
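
    Extracting the slice itself is the easy part; the hard part is the tracking. Below is a simplified TypeScript illustration (not the project's code) that samples a scalar volume along a tracked plane with nearest-neighbour lookup. A real implementation would do this on the GPU with trilinear filtering.

        // Sample a W x H slice out of a 3D scalar volume along a tracked plane.
        // origin is a point on the plane in voxel coordinates; axisU and axisV
        // are orthogonal in-plane directions scaled to one voxel per output pixel.
        type Vec3 = [number, number, number];

        function sliceVolume(
          volume: Float32Array, dims: Vec3,      // volume data and [nx, ny, nz]
          origin: Vec3, axisU: Vec3, axisV: Vec3,
          outW: number, outH: number
        ): Float32Array {
          const [nx, ny, nz] = dims;
          const out = new Float32Array(outW * outH);
          for (let j = 0; j < outH; j++) {
            for (let i = 0; i < outW; i++) {
              const x = Math.round(origin[0] + i * axisU[0] + j * axisV[0]);
              const y = Math.round(origin[1] + i * axisU[1] + j * axisV[1]);
              const z = Math.round(origin[2] + i * axisU[2] + j * axisV[2]);
              // Pixels that fall outside the volume stay at zero.
              if (x >= 0 && x < nx && y >= 0 && y < ny && z >= 0 && z < nz) {
                out[j * outW + i] = volume[(z * ny + y) * nx + x];
              }
            }
          }
          return out; // this becomes the image projected onto the tracked sheet
        }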

    Saturday, April 07, 2012

    Sony RayModeler 360 Autostereoscopic Display Prototype

    Autostereoscopic Display
    The RayModeler 3D display 

     LED light sources allow you to see an image from all angles, 360 degrees. Objects like faces and people appear realistic, giving viewers a sense of depth because the left and right eyes are seeing different images. I saw a demo of the display recently at the Sony offices, and looking at moving 3D holographic images was almost like looking into a crystal ball! You’ll notice in the video the images even react and move when prompted by the wave of a hand. Though the device can’t let you see into the future, it is the future of how we will display objects, games, or even advertisements and displays at stores.




    http://blog.sony.com/sony-prototype-360%C2%B0-3d-display
    http://blog.sony.com/raymodeler-3d-prototype-will-be-showcased-at-siggraph
    http://www.youtube.com/watch?v=6BFKC-NKRFw

    http://en.wikipedia.org/wiki/Volumetric_display

    Using Kinect for eye tracking

    "The technology works by figuring out where you’re sitting using a Microsoft Kinect camera, then guessing approximately where your right and left eyes are, and flashing images quickly on the screen targeted at each eye, which creates a 3D image in your brain."

    Full 1080P Camcorder HDV-320 $323.22 US

    It's just amazing how fast the prices are falling on this technology.

    http://www.lightinthebox.com/Highest-Resolution-Full-1080P-Camcorder-HDV-320-10-0MP-CMOS-20-0MP-Enhanced-with-3-0inch-LCD-Display-120X-Zoom--DCE303-_p106780.html

    Legend3D professional Film industry 2D into 3D

    Legend3D, Inc., is a leading digital media technology company that specializes in using its proprietary image processing technology and artistry to create greater value from entertainment assets for its clients. Legend3D’s technology enables the conversion of motion pictures, television shows, advertisements and other visual entertainment assets produced in 2D into 3D with a result that rivals, and in some cases exceeds, the quality of native 3D content. The market for 3D conversion and content has recently experienced rapid growth and Legend3D has become the largest and most experienced provider in the nascent 2D-to-3D conversion industry.


    Clients include DreamWorks Animation, Warner Bros. Pictures, Walt Disney Pictures, Lionsgate, Sony Pictures, Screen Gems and Paramount Pictures.


    http://www.legend3d.com

    2D to 3D Conversion Signal Video Converter Box Set for TV Movie Blue Ray Xbox360

    http://www.meritline.com/2d-to-3d-conversion-signal-video-converter-box-set---p-78657.aspx

    $63.99  


      2D to 3D Conversion Signal Video Converter Box Set for TV Movie Blue Ray Xbox360

      Details:
    • 2D to 3D signal video converter box
    • Convert 2D to 3D stereo image
    • Lets you watch 3D video on a common TV monitor
    • Multi-purpose design meets your different needs:
      Converts 2D content to 3D "Anaglyphic" Amber/Blue format for any 2D TV with HDMI input viewed with Amber/Blue Glasses
      Converts 3D "Side-by-Side-Half" (SBS-H) content to 3D "Anaglyphic" format for any 2D TV with HDMI input viewed with Amber/Blue Glasses (HDMI cable not included)
      Converts 2D content to "Side-By-Side-Half" 3D format for shutter glasses type 3D HDTV
      Converts 2D content to "Line-By-Line" 3D format for polarized glasses type 3D HDTV
      Converts 2D content to "Frame Sequential" format for 3D DLP Projector
      Converts 2D content to 3D with a key switch
    • 3D output color adjustment (Half color/Full color/Optimum, programmed by firmware)
    • 3D stereo effect adjustment (Convergence: Inward/Middle/Outward, programmed by firmware)
    • 3D depth effect adjustment (Index: Weak/Medium/Strong, by a key switch)
    • Supported devices for inputs: XBOX360/ PS3/ 2D Blue Ray DVD/ HD Player/ DVD/ HD-STB etc.
    • Supports 480p/576p/720p/1080p resolution
    • Remote control powered by a CR2025 button cell (included)
    • Offers a more natural and convenient way to enjoy 3D images
    • Input: AC 100-240V, 50/60HZ, 0.2A
    • Output: DC 5V, 1A
    • Max power consumption: 5W
    • Package Contents:
    • 1 x Video Converter
    • 2 x 3D Glasses
    • 1 x User Manual
    • 1 x Charger
    • 1 x Remote Controller

    Multiview extraction made easy for autostereoscopic displays

    Besides making it easier to render multiview images for autostereoscopic displays, the software has some other significant advantages:

    Software solutions for 3D Film-making, referencing the book "Think in 3D: Food for Thought for Directors and Cinematographers".

    Cool Tools: InjectIR HDMI/IR Adapter

    http://www.kk.org/cooltools/archives/005836.php

    Remote Control HDMI/IR Adapter

    Inject IR HDMI/IR Adapter

    When you're hanging an HDTV on a wall, the biggest pain is fishing wires through walls to the video equipment that's hidden in a closet or another room. It is necessary to run an HDMI cable through the wall for the audio/video, but the other pain is figuring out how to control all of your hidden equipment with your remote controls since remotes require a direct view to work (like Blu-Ray player and TIVO in the case of my parents). This tool is really cool because it lets you use the HDMI cable that is already installed to relay the IR signal back to your equipment so I don't have to run a second wire through the walls.

    I'm the one in my huge obnoxious family that everyone asks to wire their house with an ethernet network, surround sound system, home theater set-up, etc. This is my third home theater set-up job, the other two I've used a wiring system that requires you to screw a connector block to the closet where the equipment is and to run these wires through the wall. That took hours. This kit takes a couple minutes to set up. All you really have to do is unplug your HDMI cable from your TV, plug in the IR injector adapter, and replug the HDMI cable into the IR injector adapter (and then repeat on the video equipment side). Then, I just took the IR blaster that is on the equipment side and pointed it in the direction of the video equipment, and it works like a charm 100% of the time.

    In the past I have used the wired kit from Cables to Go a couple times. It is a really reliable kit, but it is almost twice as expensive and takes twice as long to install. I've also seen other wireless models but read in forums that those convert the IR signal into an RF signal that can go through walls, but that the RF signal is too similar to your wifi signal and so the interference makes it not work 100% of the time.

    Cool Tools: TV-B-Gone

    http://www.kk.org/cooltools/archives/001135.php

    Television Eliminator

    TV-B-Gone

    Switch off thousands of TVs using just one small remote! When you want some peace and quiet in that local bar, restaurant, or office, all you need to do is hit the TV-B-Gone button. I've used it in bars and clubs, and in the headquarters building of a large South African bank which had too many TVs on the walls, some of which needed to be switched off. It really does work.

    [When you press the button, TV-B-Gone takes slightly more than a minute to emit more than 200 popular shutdown codes, like trying every possible combination to open a safe. The instructions include a diatribe against television in general, as if using this product is not merely a prank, but a serious political act. -- CP]

    TV-B-Gone $20

    Available from Amazon

    Friday, April 06, 2012

    Stop YouTube (s.ytimg.com) Video Camera Spying

    Did you know that your camera enabled computer can spy on you? Did you know that by watching that video or playing that cute game you may be allowing a site to watch you? Did you know that YouTube (via s.ytimg.com) seems to be one such site?
    Learn how to stop this here.

    http://www.keiths-place.com/blogs/keith/2008/stop-youtube-sytimgcom-video-camera-spying

    Stolen Samsung AMOLED Display Technology

    Wow, I look forward to reading about the details. It's like something out of a novel.

    11 Arrested for Stealing, Selling Samsung Display Technology
    http://www.wired.com/gadgetlab/2012/04/11-arrested-for-stealing-selling-samsung-display-technology

    Stolen Samsung AMOLED technology sold to rival, 11 suspects arrested
    http://www.bgr.com/2012/04/05/stolen-samsung-amoled-technology-sold-to-rival-11-suspects-arrested/

    Turn Your Boring HD Videos Into 3-D Masterpieces on YouTube

    http://www.wired.com/gadgetlab/2012/04/turn-your-boring-hd-videos-into-3-d-masterpieces-on-youtube

    3D at the push of a few buttons. Photo: YouTube
    Unless you’re James Cameron or Peter Jackson, sharing a 3-D video with your friends is damn near impossible. Which is a shame because that video of you falling down the stairs demands a third dimension.
    YouTube wants you to share your amazing HD action videos in glorious 3-D with the push of a button. Beginning Thursday, all HD videos on YouTube have the option to be encoded in 3-D.
    Before you start looking for videos in your YouTube account to convert to 3-D, consider the following, as the process isn’t completely turn-key.
    First, the feature only works with HD videos (all the HD videos in my YouTube account, including videos uploaded over a year ago, included an option for 3-D viewing). To enable the 3-D viewing option for end users, the feature can be set in the “Edit Info” section of each video.
    Second, in order for an end user to see a video in 3-D, he or she will need to set playback to HD as well. At this point, the 3-D feature appears in the options area below the timeline. Once 3-D is enabled, there are a variety of 3-D viewing options. For stereoscopic, color-based glasses, users can opt for red/cyan, green/magenta, and blue/yellow. Interleaved glasses are also supported. Side-by-Side view — the eye-crossing “no glasses” feature — and HTML5 Stereo view are also available.
    The steps necessary to view a video in 3-D seem a bit convoluted. But, hey, it’s 3-D. Having one button that adjusts the playback to HD, and automagically turns on 3-D probably would have been too simple.
    Google explained how they take your boring 2-D video and turn it into in-your-face 3-D:
    • We use a combination of video characteristics such as color, spatial layout and motion to estimate a depth map for each frame of a monoscopic video sequence
    • We use machine learning from the growing number of true 3-D videos on YouTube to learn video depth characteristics and apply them in depth estimation
    • The generated depth map and the original monoscopic frame create a stereo 3-D left-right pair that a stereo display system needs to display a video as 3-D
    Expect an onslaught of videos of various individuals punching at their cameras in the next few weeks as users test the service.
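
    The last step Google describes boils down to shifting each pixel horizontally by an amount proportional to its estimated depth. A toy TypeScript version of that synthesis (assuming the depth map already exists, which is the hard machine-learning part, and using grayscale frames for brevity) looks roughly like this:

        // Toy depth-image-based rendering: build a left/right view pair from one
        // frame plus a per-pixel depth map (0 = far, 1 = near). Gaps left by the
        // shifts stay unfilled here; real converters inpaint them.
        function makeStereoPair(
          frame: Float32Array, depth: Float32Array,
          width: number, height: number, maxDisparityPx: number = 8
        ): { left: Float32Array; right: Float32Array } {
          const left = new Float32Array(frame.length);
          const right = new Float32Array(frame.length);
          for (let y = 0; y < height; y++) {
            for (let x = 0; x < width; x++) {
              const idx = y * width + x;
              // Nearer pixels get a larger horizontal offset between the views.
              const shift = Math.round(depth[idx] * maxDisparityPx);
              const xl = x + shift;
              const xr = x - shift;
              if (xl >= 0 && xl < width) left[y * width + xl] = frame[idx];
              if (xr >= 0 && xr < width) right[y * width + xr] = frame[idx];
            }
          }
          return { left, right };
        }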