The future! Now!!
Keep the internet good and weird. Join the fight for #netneutrality.
9to5.tv
The art exhibition has announced an open call for submissions for projects that incorporate livestreaming:
9to5 is a month-long digital art exhibition in Atlanta that dissolves the boundary between artist and audience by way of an experimental livestream and other emerging interfaces.
Unlike other art exhibitions, the ideal 9to5 experience is online, where patrons can interact with the projects and performances broadcast, and influence the final artworks using our custom-built suite of inputs.
If you have a potential project that utilizes livestreaming and/or digital inputs to blur the line between ‘viewer’ and ‘collaborator’, we want to hear from you. From data visualization to interpretative dance, podcasts to prose, AI to the humanities, all artists and technologists with an interest in experimentation are welcome.
Talent will have access to a ton of new technology courtesy of Georgia State University.
For talent that needs it, 9to5 will cover travel to Atlanta, along with lodging and food.
All submissions will be considered, but there are limited slots available in each category.
APPLY BEFORE JULY 17
Cinema without people: The Conformist (1970, Bernardo Bertolucci, dir.)
Take a journey through colors and shapes with this animated short.
Natural Human-Drone Interaction
A research project from Eirini Malliaraki illustrates ideas for drone programming, from gesture to emotion recognition:
1-month graduate project // Royal College of Art & Imperial College // May 2017
Taking inspiration from the interaction between falconers and their birds of prey, as well as from common daily gestures, cybernetics, dance, and robotics, several themes were explored, namely:
- a gesture-based interaction scheme that attempts to create a more intuitive and natural way to communicate with aerial robots
- ways in which aerial robots can become more autonomous by interpreting their environment in richer ways
- ways in which they can communicate their intentions and give feedback
- ways in which an aerial robot can understand and react to human emotions and eventually influence our behaviour
Parrot AR Drone, Node.js, JavaScript, Affectiva Emotion Analysis SDK
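Since the project pairs a Parrot AR Drone with Node.js, a gesture-based scheme like the one described above might boil down to a mapping from recognized gestures to flight commands. Here is a minimal sketch of that idea; the gesture names and command table are invented for illustration, not taken from the project, and the dispatch step assumes the community `ar-drone` npm package.

```javascript
// Hypothetical sketch: translating recognized gestures into flight
// commands for a Parrot AR Drone. The gesture vocabulary below is an
// assumption, not the project's actual interaction scheme.

// Pure mapping from a gesture label to a command descriptor.
function gestureToCommand(gesture) {
  const table = {
    'palm-up':     { action: 'up', speed: 0.3 },
    'palm-down':   { action: 'down', speed: 0.3 },
    'sweep-left':  { action: 'counterClockwise', speed: 0.5 },
    'sweep-right': { action: 'clockwise', speed: 0.5 },
    'beckon':      { action: 'front', speed: 0.2 },
    'halt':        { action: 'stop' },
  };
  // Unknown gestures fail safe: hover in place.
  return table[gesture] || { action: 'stop' };
}

// With hardware attached, commands could be dispatched via the
// `ar-drone` package (assumed dependency), roughly like this:
// const arDrone = require('ar-drone');
// const client = arDrone.createClient();
// client.takeoff();
// const cmd = gestureToCommand('beckon');
// if (cmd.action === 'stop') client.stop();
// else client[cmd.action](cmd.speed);
```

Keeping the gesture-to-command table as a pure function makes it easy to test without a drone, and the same shape would work for the emotion-driven behaviours the project mentions (an emotion label in, a flight adjustment out).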
Bold & Bright!
I like it!!
:)
Aston Martin DBR1, 1956. One of only five ever made, it is to be auctioned by RM Sotheby’s at the Pebble Beach Concours d'Elegance. This is the first time a DBR1 has been offered at public auction, and Sotheby’s expects it to top $20 million. That could make it the most valuable Aston Martin ever sold at auction, and the most valuable British car of any marque.
Photographs by Tim Scott
Bill Watterson (via h-o-r-n-g-r-y)
Computational Video Editing for Dialogue-Driven Scenes
A research paper from Stanford University and Adobe Research presents a proof-of-concept system to simplify and automate video editing processes:
We present a system for efficiently editing video of dialogue-driven scenes. The input to our system is a standard film script and multiple video takes, each capturing a different camera framing or performance of the complete scene. Our system then automatically selects the most appropriate clip from one of the input takes, for each line of dialogue, based on a user-specified set of film-editing idioms. Our system starts by segmenting the input script into lines of dialogue and then splitting each input take into a sequence of clips time-aligned with each line. Next it labels the script and the clips with high-level structural information (e.g., emotional sentiment of dialogue, camera framing of clip, etc.). After this pre-process, our interface offers a set of basic idioms that users can combine in a variety of ways to build custom editing styles.
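The per-line selection step the abstract describes can be sketched as a scoring loop: for each line of dialogue, score every take's time-aligned clip against the user's chosen idioms and keep the best. This is a simplified illustration only; the labels and idiom functions below are invented, and the actual system also accounts for transitions between clips rather than choosing each line greedily.

```javascript
// Illustrative sketch of idiom-based clip selection. Each take holds one
// clip per line of dialogue (already time-aligned, per the abstract), and
// each clip carries high-level labels such as camera framing.
function selectClips(lines, takes, idioms) {
  return lines.map((line, i) => {
    // Candidate clips for this line: one per input take.
    const candidates = takes.map(take => take.clips[i]);
    // Score each candidate as the sum of all applicable idiom bonuses.
    const scored = candidates.map(clip => ({
      clip,
      score: idioms.reduce((sum, idiom) => sum + idiom(line, clip), 0),
    }));
    scored.sort((a, b) => b.score - a.score);
    return scored[0].clip;
  });
}

// Example idiom (invented): prefer close-ups on emotionally intense lines.
const emphasizeEmotion = (line, clip) =>
  line.sentiment === 'intense' && clip.framing === 'close-up' ? 1 : 0;
```

Combining idioms is then just passing more scoring functions into `selectClips`, which mirrors how the paper's interface lets users compose basic idioms into custom editing styles.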