
How to Integrate Leap Motion with Arduino & Raspberry Pi


For hardware hackers, boards like Arduino and Raspberry Pi are the essential building blocks that let them mix and mash things together. But while these devices don’t have the processing power to run our core tracking software, there are many ways to bridge hand tracking input on your computer with the Internet of Things. You’ll just need to interface the computer running the Leap Motion software with your favorite dev board!

In this post, we’ll look at a couple of platforms that can get you started right away, along with some other open source examples. This is by no means an exhaustive list – Arduino’s website features hundreds of connective possibilities, from different communication protocols to software integrations. Whether you connect your board directly to your computer, or send signals over wifi, there’s always a way to hack it.

Platform Integrations


Wireless Control with Cylon.js

For wireless-enabled controllers, it’s hard to beat the speed and simplicity of a Node.js setup. Cylon.js takes it a step further with integrations for (deep breath) Arduino, Beaglebone, Intel Galileo and Edison, Raspberry Pi, Spark, and Tessel. But that’s just the tip of the iceberg, as there’s also support for various general purpose input/output devices (motors, relays, servos, makey buttons), and inter-integrated circuits.

On our Developer Gallery, you can find a Leap Motion + Arduino example that lets you turn LEDs on and off with just a few lines of code. Imagine what you could build:

"use strict";

var Cylon = require("cylon");

Cylon.robot({
  connections: {
    leap: { adaptor: "leapmotion" },
    arduino: { adaptor: "firmata", port: "/dev/ttyACM0" }
  },

  devices: {
    led: { driver: "led", pin: 13, connection: "arduino" }
  },

  work: function(my) {
    my.leapmotion.on("frame", function(frame) {
      if (frame.hands.length > 0) {
        my.led.turnOn();
      } else {
        my.led.turnOff();
      }
    });
  }
}).start();
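To try this locally, you’ll need Node.js, the Leap Motion service running, and an Arduino flashed with the StandardFirmata sketch. Assuming the standard Cylon modules, installing cylon, cylon-firmata, cylon-gpio, and cylon-leapmotion with npm pulls in everything the script needs, and running it with node starts the robot. (The serial port – /dev/ttyACM0 above – varies by operating system.)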

 
Visual Scripting with Vuo

Vuo is a visual scripting platform that features the ability to connect a wide variety of inputs and outputs. The latest version includes support for Serial I/O, which opens up access to a range of devices, including Arduino boards. Read more about Vuo in our featured blog post or learn more from their feature request thread.

Open Source Examples

Arduino + 3D Printer

This Processing sketch from Andrew Maxwell-Parish lets you control a Makerbot 3D printer with hand movements. It works by gathering the XYZ position of your fingers from the Leap Motion API, converting it into the Cartesian coordinates needed for the 3D printer, packaging it into G-code format, and sending it to the printer via an Arduino-based controller.
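The original sketch is written in Processing, but the coordinate-conversion step at its heart is easy to illustrate. Here’s a rough JavaScript sketch of the idea – the offsets and build-volume bounds below are made-up placeholders, not values from the project:

function fingerToGcode(pos) {
  // Leap Motion positions are millimeters with the origin at the device:
  // x runs right, y runs up, z runs toward the user.
  var clamp = function(v, lo, hi) { return Math.min(Math.max(v, lo), hi); };
  var x = clamp(pos[0] + 100, 0, 200); // Leap x -> printer X
  var y = clamp(pos[2] + 100, 0, 200); // Leap z -> printer Y
  var z = clamp(pos[1] - 80, 0, 150);  // Leap y (height) -> printer Z
  // G1 is a linear move; F sets the feed rate in mm/min
  return "G1 X" + x.toFixed(2) + " Y" + y.toFixed(2) +
         " Z" + z.toFixed(2) + " F3000";
}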


Arduino Motor Shield + Clock Motors

The Arduino Motor Shield is an add-on board that makes it easy to control motor direction and speed with your Arduino. (Check out this great Instructables tutorial to set it up.) Using the Motor Shield and a Processing script, interactive design student James Miller created BLAST: Time – a little sculpture with a clock that responds to people’s hand movements.

Robotic Arm

Built in Python, this award-winning hack from last year’s LA Hacks featured a robotic arm with five different motors. Leap Motion data was sent to the arm controller through a Bluetooth connection, allowing the team to place in the competition finals.


Hacking a Radio Drone

When it comes to DIY robotics, taking the sneaky route is often half the fun. Using an Arduino-to-radio interface, another LA Hacks team was able to hack a drone’s radio signal, taking control from analog to digital. According to hardware hacker Casey Spencer, “remote control through the Internet, computer-aided flight, or even – with a little more hardware – complete autonomy is now possible for the majority of consumer drones already out there.”

Are you using Leap Motion interaction to help trigger the robo-apocalypse? Let us know in the comments, or share your project on the community forums!



Augment Your Arm: Designing 3D Printed Wearables on Your Skin


As our physical reality becomes increasingly augmented, creative coders are able to access a whole new trove of intriguing possibilities. Several weeks back, we stumbled upon one such experiment called TACTUM, which combines projection mapping, motion controls, depth sensing, and 3D printing to create customized wearables. With all that technology, the design process is surprisingly simple – all you need is the light on your skin.

TACTUM is the creation of research and design studio MADLAB.CC. Earlier this week, we caught up with head designer and researcher Madeline Gannon to find out more about the mixed-media work, as well as her artistic process.


What’s the hardware setup behind TACTUM?

TACTUM is an augmented modeling tool that lets you design 3D printed wearables directly on your body. It uses depth sensing and projection mapping to detect and display touch gestures on the skin. A person can simply touch, poke, rub, or pinch the geometry projected onto their arm to customize ready-to-print, ready-to-wear forms.


What inspired you to incorporate Leap Motion technology into TACTUM?

We first implemented TACTUM using a Microsoft Kinect. However, our second version switched to a Leap Motion Controller. Pragmatically, this let us test whether our system was generalizable to many kinds of depth sensors. The speed and accuracy of the Leap Motion Controller also made it much easier and more reliable to projection map our digital geometry onto a moving body.

Using its skeletal tracking capabilities, we were able to dynamically project digital content onto a moving arm, and we used it as a touch sensor to detect and track tactile interactions with the body. The goal of TACTUM was to create a gestural modeling tool that did not rely on mid-air interactions. Instead, we used the controller to detect how a person is touching, pinching, or poking their arm, and we use these gestures to modify interactive geometry in our modeling environment.


What tools or resources did you use in building TACTUM?

We used the Leap Motion Java API to create our skeletal tracking and touch gesture detection, we used Processing and the Toxiclibs library to create our modeling environment and interactive geometry, and we used OpenCV to calibrate our projection mapping.

To keep up with Madeline’s latest projects and research, follow her on Twitter, check out her online portfolio, or read her co-authored white paper on skin-centered design.

Want to see another light-bending art project? See how Felix Faire was able to transform any surface into a musical instrument.


Featured Platform: Build Visually Stunning Experiences with Processing


One of the most powerful things about the Leap Motion platform is its ability to tie into just about any creative platform. That’s why we’ve launched a Platform Integrations & Libraries showcase where you can discover the latest wrappers, plugins, and integrations.

Among developers, interactive designers, and digital artists, Processing is an enormously popular way to build compelling experiences with minimal coding. We’ve seen hundreds of Leap Motion experiments using Processing, from Arduino hacks to outdoor art installations, and the list grows every week.

“Many people are interested in what programming can do, but get really frustrated when they start getting into things like tedious details of many languages. Processing abstracts a lot of that stuff away, so designers and artists can just focus on building.”

James Britt, aka Neurogami, is the developer behind the LeapMotionP5 library, which brings together our Java API with the creative power of Processing.

He’s just rolled out a major update to the library, including a new boilerplate example and a demo designed to bridge hand input with musical output. We caught up with James to ask about the library, his latest examples, and how you can get started.

Building a drawing program from scratch

One of the best features of Neurogami’s library is the ability to poll or use callbacks, depending on your preference. Polling within the draw() function is a popular approach for many beginners because it’s easy to get started, while more advanced Processing sketches frequently use callback handlers. However you prefer to write your Processing sketches, this library supports it.
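The polling-versus-callback split is easiest to show with our JavaScript client library, which exposes the same two styles (the Processing library’s method names differ, but the pattern is identical):

var Leap = require("leapjs");

// Callback style: react to every frame as it arrives.
var controller = new Leap.Controller();
controller.on("frame", function(frame) {
  console.log("hands visible:", frame.hands.length);
});
controller.connect();

// Polling style: ask for the most recent frame on your own schedule,
// the way a Processing sketch would inside draw().
setInterval(function() {
  var frame = controller.frame(); // most recent frame received
  if (frame.valid) console.log("palms:", frame.hands.length);
}, 16); // roughly 60 times a second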

To help get you started, Neurogami has created a simple drawing program to demonstrate how to reference the Leap Motion library, tell it to look for gestures and finger positions, and use that to draw to the screen.


OSC plugin for Renoise and beyond

Neurogami’s latest project is an OSC (Open Sound Control) sketch for Renoise and similar music creation programs – one that makes it easy to map Leap Motion input to all kinds of sonic output. His goal, he says, isn’t to create a virtual keyboard, but instead to make it easy for musicians to access the creative potential of hand tracking.

On our Gallery, you can find two different versions of the project: one that relies on the draw() loop, the other using callbacks from the Leap Motion software.


“The interest for me lies in how we can use the space we’re given, and what that means,” he says. “Playing the bass on piano is different from playing the bass on a bass guitar. I create music differently when I’m using a keyboard versus a monome. It all depends on how you and your body use the space.”

Seasoned musicians know that latency is an essential question during live performance. Since Processing tends to run around 60 frames per second – roughly 17 milliseconds per draw() pass, before tracking and audio latency are added – Neurogami points out that the overall latency is probably too high for hitting precise notes. “However, it’s great for triggering different cues or mixing in new beats.”

As for the future of the library, his philosophy is to see how people use it first. “Trying to anticipate how people might want to use something is a problem with a lot of open source projects. Ideally, someone will use my project and want to fork it or submit a patch to add additional functionality. I could spend a lot of time trying to add new features that no one wants. That becomes a lot more code to support, and ultimately I want to pursue what interests me and will actually use. If someone has a feature request, they need to make a case for it.”

Do you have a Processing project in the works, or are you looking to start one? Let us know about it in the comments, or get inspired with previously featured Processing projects.


Music Video Calls in the SWAT Team for Dirty Dish Crimes


Music videos have evolved significantly since TRL. Last week, we were thrilled to come across a new release from Darwin Deez called Kill Your Attitude, where the perils of modern love take some truly bizarre emotional and technical twists. Love (literally) becomes a battlefield when Darwin’s angry girlfriend becomes the player in a first-person shooter, taking the central conflict to some vividly imaginative heights as she hunts him down for great justice.


As director Dent de Cuir (of Caviar Content) explained: “We thought it was interesting to design an FPS video game and use it as a narrative canvas to speak about little wars which occur during the lifespan of a relationship. Our original idea was to use video game footage and mash it up with in-camera footage, but very early on in the research phase we had a conversation with the lovely team at Ruffian – our post house for the project – who suggested Unity, a cross-platform game engine.

“Two hours later we had the software up and running and were doing location scouting in digital environments. Ruffian were crazy enough to jump into the idea of creating a custom video game to sit Darwin Deez in and just play with it. As directors, it was a privilege to work without any real constraints.”


Lurid, painterly details and intricate, swooping shots seemingly captured from the tops of skyscrapers were all within easy reach of the team’s Unity engine. VFX and CGI studio Ruffian Post-Production created the entire game from scratch in Unity, using our Core Assets to build out interactions and Mixamo for character design.

“We had all these first-person hand movements that we needed to achieve. So we were like, ‘Wouldn’t it be cool if we could use Leap Motion to do it?’” said Christopher Watson-Wood, Ruffian’s Head of CG and ECD. “We played around with it and it worked. Without Leap Motion, we probably would have had to hand-animate every first-person hand, and as a result there probably wouldn’t have been as many.”


In about two hours, Christopher was able to write a Unity plugin to export the camera data into After Effects. Once everything was rendered in Unity, the team composited their work as a separate layer in After Effects.

“In the future, we’d like to use the Leap Motion Controller as an animation controller, not for real time, but motion-capturing that data – and using it to control, like, monster’s tentacles or anything you want to assign an organic, custom movement to,” Christopher told us. “Traditionally, if you want to do motion capture, you do it full-body in a studio. Anything you can do on your desktop with a Leap Motion Controller saves a lot of time. I think Leap Motion easily fits into that kind of workflow where you don’t want to have to manually animate something with keyframes.”

Ruffian’s post-production team of four produced the experience in just a little over a month. With a feat like this coming out of such a short sprint, we’re excited to see what the future holds for this studio.


DARWIN DEEZ – KILL YOUR ATTITUDE
Label: Lucky Number Music
Production Company: Caviar Content
Director: Dent de Cuir
Post Production (CGI, VFX, grade, edit): Ruffian Post

CREDITS:
Artist: Darwin Deez
Label: Lucky Number
Commissioner: Stephen Richards
Director: Dent De Cuir
Executive Producer: Ore Okonedo
Producer: Rohan Scully
PA: Tom Ralph
Production Company: Caviar London
DOP: Alexandre Icovic
Post: Ruffian Post
Head of CG / ECD: Christopher Watson-Wood
Executive Creative Producer: Amanda Jones
Lead Editor / Compositor: Harry Davidson
Lead CG Artist: Toby Williams-Ellis
Editor / Director of Kill Your Attitude Behind the Scenes: Jack Tew

For further press details about the video, please contact Amanda at Ruffian Post amanda@ruffianpost.com.


Firework Factory VR: Touch the Show this Fourth of July


Boom! The white globe in front of you explodes into an array of color and light. A fraction of a second later – whoosh! – glowing stars streak past your head, leaving you in their colorful wake.

Reaching toward the holographic interface, with the motion of a single finger, you take control of time itself. The firework slows. Stops. Then it begins to recede back to the center. You slow time again as the stars ease past you, watching as the firework surrounds you. Entropy turns on its head again, and the firework calmly implodes into a single white globe.

But how would this firework look in orange and yellow? Exploding in a spiral pattern? You casually switch between holographic menu panels to make some changes. You’re about to find out with Firework Factory VR.


The Next Generation of User Interfaces

User interfaces (UIs) play a crucial role in virtual and augmented reality, especially at this early stage of the medium’s evolution. In many ways, Firework Factory VR is an experiment with UI design for VR – with the goal of creating a beautiful, easy-to-use, and approachable user interface.


At the core of the UI, you’ll find the Hover VR Interface Kit (also created by Aesthetic Interactive). Firework Factory VR builds upon the Hoverboard interface, adding three-dimensional icons, smooth flip/slide/fade transitions, and custom tab-shaped items. Hover VR interfaces use the Leap Motion Controller, providing hand-based interactions and a strong sense of immersion in the virtual space. By standardizing on a “hover” selection method, these interfaces encourage consistent and reliable usability across all types of menus.

The Hover VR project began as a single interface, Hovercast, which places an arc-shaped menu beyond the fingertips of one hand. You can learn more about the core concepts behind Hover VR in my posts Power At Your Fingertips and Behind The Design.


Hover VR interfaces like Hovercast promote easy, intuitive, reliable menu interactions – remaining simple even as an application’s menus and options become more complex.
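To make the “hover” selection concept concrete, here’s a generic dwell-timer sketch in JavaScript – an illustration of the pattern, not Hover VR’s actual implementation (the 600 ms duration is an arbitrary choice):

function makeHoverTarget(durationMs, onSelect) {
  var enteredAt = null;
  var fired = false;
  return {
    // Call once per frame; returns 0..1 progress for drawing a fill indicator.
    update: function(isInside, now) {
      if (!isInside) { enteredAt = null; fired = false; return 0; }
      if (enteredAt === null) enteredAt = now;
      var progress = Math.min((now - enteredAt) / durationMs, 1);
      if (progress === 1 && !fired) { fired = true; onSelect(); }
      return progress;
    }
  };
}

// Example: fire after hovering for 600 ms, reset when the cursor leaves.
var target = makeHoverTarget(600, function() { console.log("selected"); });
// each frame: target.update(cursorIsOverTarget, Date.now());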

Engineering Your Own Explosions (And Sharing Them Too!)

Firework Factory VR gives you the power to create millions of unique firework combinations, then watch them explode all around you. In the “Factory,” you build fireworks by selecting colors, sizes, shapes, and styles. Each firework has two groups of settings (“A” and “B”) for customizing its stars, tails, and explosion patterns. Each change immediately affects the firework, so you can see exactly how the current selections will look.


The customizations don’t end with the fireworks, however. You can control the flow of time, moving forward or in reverse. You can set the position, motion, and direction of your “camera”. Ultimately, you’re in control of the experience. You could choose a fast, loud, in-your-face firework experience. Or you might prefer a slow, smooth, bullet-time perspective. The choice is yours – and you can transition seamlessly from one mode to the next.

You can also enhance your experience with the “share” feature. Every firework combination generates a unique Firework ID, which you can use to save, share, and load your favorites. The “share” feature also creates a high-quality image of your firework, making it easy to show off your best creations. Be sure to mention @FireworkVR on your Twitter posts!


Example firework images, generated by the “share” feature.

Support Indie VR!

Firework Factory VR is available as a free demo, which gives you full access to the “experience” features (like controlling time and the camera), but limited options for customizing your fireworks.

Once VR devices are available to consumers, we’ll be making the full version of Firework Factory VR available for purchase. In the meantime, you can support Aesthetic Interactive’s exploration into new VR apps and tools by sharing this blog post, the app video, or your firework images on social media. Aesthetic Interactive is a small software development company with big ideas for how we’ll interact, create, and explore in virtual reality. We would really appreciate your feedback, participation, and support!

What’s Next?

The code and concepts behind Firework Factory VR provide a strong foundation for future projects. From this foundation, we plan to build applications focused on creativity, visualization, and learning – where VR and 3D input devices (like the Leap Motion Controller) make exciting new experiences possible.

The Hover VR Interface Kit will also evolve, with new interfaces and ways to improve usability. Other potential projects include things like a visual “harness” to guide gesture-based input, and a tool for natural-looking avatar movement using data from VR headsets and 3D input devices. If you are interested in learning more about any of these projects, feel free to reach out to @zachkinstner on Twitter!

This demo is just one small (but explosive!) step toward a larger vision for virtual reality experiences – exploring how they should look, feel, and function. How does it feel to be surrounded by fireworks? To use the menu interface? To control time itself? Let us know!



This is Your Brain. This is Your Brain in VR.


Have you ever received an MRI scan back from the lab and thought to yourself, “I’m not sure how even a medical professional could derive any insightful information from this blast of murky images?” You’re not alone. But what if, instead of settling for your doctor’s opaque interpretation, you could physically walk through your ailment with your doctor in VR, parsing and pointing out the nuances of pain felt within pieces of inflamed tendons or nerves or sections of your brain?

Brain Connectivity, a new example in the Developer Gallery, marks the beginning of a Master’s Thesis project from Biomedical Engineering student Filipe Rodrigues. The experiment uses slices of MRI scans and Multimodal Brain Connectivity Analysis to reconstruct a 3D model of the brain. Using tractography images, the user is able to see how regions of the brain are connected to each other. Based on this information, a matrix can be drawn of all the connected regions.

“Turning a boring 2D [MRI] visualization into a 3D interactive one is a great way of appealing to people who are maybe not from this discipline,” Filipe told us. “It’s a good way to visualize complex concepts and try to make them as appealing as possible. Leap Motion is a great tool for this.”


The project was primarily built in Unity, utilizing our widgets to cue interaction design. The brain model itself consists of Magnetic Resonance Images processed with Freesurfer. The connectivity graphs are computed from Diffusion Tensor Imaging tractography data processed with Diffusion Toolkit / Trackvis and Brain Connectivity Toolbox.

While the visualization portion of the project has proven to be an interesting challenge in and of itself, Filipe hopes to expand his thesis far beyond this demo. “In the beginning, I wanted to develop a haptic glove that I could use with Leap Motion in a virtual reality scenario, allowing me to feel the stuff I touched,” he explained. “For example, when I interrupt brain connectivity, I want to feel the vibration on my fingers. That was the initial goal for my thesis. I’m still working on it, but I got too involved with the software part of it. I’m loving working with Leap Motion so I’ve postponed the glove component of the project a bit.”

Filipe does, however, have a working prototype. He removed vibration motors from two broken down cell phones, then glued them to a glove and wired that to an Arduino. From there, he looped in Unity using a handy package called Uniduino. You never know when you’ll strike gold in the asset store.

In addition to haptics, Filipe and his team hope to take the project from a purely aesthetic to a hardcore scientific place. The goal would be that for every interruption your hand makes on the graph, information would pop up about exactly what you’re interacting with. He’d also like to expand the interaction design to make the project as accessible as possible for the general user.

“Ideally,” Filipe said, “we’d like to turn it into two apps. One that’s more focused on the aesthetics and the interaction part of it, and one that’s more focused on the science behind it.”

Want to dig around the experience for yourself? Download it here in the Leap Motion Developer Gallery.


Leap Motion 3D Jam 2.0 Launches on Sept. 28th!


We’ve come a long way since we first launched the Leap Motion Controller two years ago. Today, we’re marking the occasion by announcing our second annual 3D Jam! For six weeks, starting on Sept. 28th, developers around the world will build innovative experiences for virtual reality, desktop, mobile, and beyond.

Since we released our technology to the world, we’ve been constantly working to bring new tools and assets to developers building with the Leap Motion platform. Resources like video passthrough, Image Hands, and UI Widgets are all small but fundamental steps in building the future of VR. We can’t wait to see what kinds of experiences you can build with them.

Last year’s competition was incredible, with over 150 submissions and some really amazing titles. For 3D Jam 2015, teams will compete in two tracks – Open and AR/VR. We’re giving away over $50,000 in prizes. (Update: now up to $75,000 and counting!) Entries will be accepted until November 9th, 2015 at 11:59:59 pm PST. (Full contest rules here.)

REGISTER NOW

AR/VR TRACK
Augmented and virtual reality experiences built on tethered HMDs
1st Prize: $10,000
2nd Prize: $7,500
3rd Prize: $5,000
4th Prize: $2,500
5th Prize: $1,000
Community Favorites (2): $500 + Hardware
OPEN TRACK
All other platforms, from desktop and Android to the Internet of Things
1st Prize: $10,000
2nd Prize: $7,500
3rd Prize: $5,000
Community Favorite (1): $500 + Hardware

Everyone who signs up for the 3D Jam will get a special code for 22% off anything in our web store. Teams who complete a submission by November 9th will get a refund for the cost of the hardware. Register now at developer.leapmotion.com/3djam.

OFFICIAL RULES
Eligibility requirements and judging process
DEVELOPER FORUM
Share your projects and get support
VR GUIDE
Essential resources for VR development

To celebrate our two-year anniversary, we’re also taking 22% off all orders in our web store with the promo code 2YEARS22 (terms and conditions). This offer is only available for one week, so get it today.

We’re excited to see what you’ll create in the months ahead – be sure to share your progress with the hashtag #3Djam, tweet us @leapmotiondev on Twitter, like us on Facebook, and join the conversation on our subreddit. For up-to-the-instant updates, check out our official forum thread.

Dream big and build what inspires you. Good luck!

The Leap Motion team


Live from Berlin! VR Workshops for Unity & JavaScript


Hey everyone! As part of our global tour for the Leap Motion 3D Jam, we’re at Berlin’s Game Science Centre to take developers through our SDK and the latest VR tools. Registrations for the workshops and meetup are still open. The livestream is happening today from 8am–1pm PT (5–10pm CET) at the top of this post – jump into our Twitch channel to join the chat session!

Ahead of the event, we thought we’d give you a quick overview of what to expect. Let’s take a light-speed look at VR development with Leap Motion in Unity and JavaScript.

Why Hands in VR? Escaping from Flatland

We believe that if virtual reality is to be anything like actual reality, then fast, accurate, and robust hand tracking will be absolutely essential. With the Leap Motion Controller, you can quickly bring your hands into almost any virtual reality experience. Our plugins and resources for Unity, Unreal, and WebVR include fully interactive virtual hands that can interact with objects in 3D scenes.

Before you start building, it’s important to know that designing for motion control involves a whole new way of thinking about interactions. Physics engines aren’t designed with hands in mind, and traditional UIs are built for 2D screens. Here are some key resources that will help you build compelling experiences that feel natural:

Unity3D

One of the world’s most popular game engines, Unity makes it easy to rapidly build and develop VR projects. Due to recent changes to the Oculus plugin in Unity, we currently recommend using Unity 5.0 and Oculus 0.5 for your project.


Along with a full Leap Motion + Oculus integration and a variety of demo scenes, our Unity 5 core assets include several additional features that really stand out:

  • Image Passthrough: This gives you access to the raw infrared camera feed from the controller, letting you see the world in ghostly infrared.
  • Image Hands: These bring your real hands into virtual reality, using the live cameras instead of rigged models.
  • UI Widgets: Buttons, sliders, scrollers, and dials that provide the fundamental building blocks for your VR experience.

To get started, download the Core Assets package and dig into the included demo scenes. You can also explore a variety of Unity demos on our Developer Gallery.

JavaScript

The Leap Motion software also includes support for modern browsers through a WebSocket connection. LeapJS, our JavaScript client library, is hosted on a dedicated CDN using versioned URLs to make it easier for you to develop web apps and faster for those apps to load. You also have access to a powerful and flexible plugin framework to share common code and reduce boilerplate development. In particular, the rigged hand lets you add an onscreen hand to your web app with just a few lines of code.
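To give a sense of how little code is involved, here’s a minimal LeapJS loop – this assumes the leap.js script from the CDN is already included on the page:

// Log the first visible hand's palm position on every tracking frame.
Leap.loop(function(frame) {
  if (frame.hands.length > 0) {
    var palm = frame.hands[0].palmPosition; // [x, y, z] in millimeters
    console.log("palm at", palm[0], palm[1], palm[2]);
  }
});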

Recently, we’ve been using experimental browsers to play around with virtual reality on the web. Mozilla provides the Oculus integration out of the box with a special VR Firefox build, while the latest Chrome builds are available here. For a quick boilerplate, be sure to check out our VR Quickstart demo, or reach into VR Collage to see a more complex project. Each of these projects is fully documented, and you can find a deep dive into the boilerplate on the Leap Motion blog.

Feeling inspired? Be sure to tune into today’s livestream from 8am–1pm PT (5–10pm CET) and register for the 3D Jam! We can’t wait to see what you’ll build.



Video Series: Taking VR Guitar to a Whole New Depth


As the 3D Jam approaches, developers around the globe are already getting a headstart on their projects. Zach Kinstner, the creator behind Hovercast and Firework Factory, has been sharing his latest project through a series of videos – a virtual reality guitar! We caught up with Zach this week to talk about his design process, especially the guitar’s unique combination of visual depth cues.


Stage 1: Building the Guitar

In this first video, Zach dives quickly into Unity, having set up the MIDI guitar sounds and attached them to some simple visual “strings.” He adds a visual indicator to the fingertips to help show the user where the hand is in 3D space.

What’s the importance of these visual indicators?

Visual indicators are very helpful for understanding where your hands are in virtual space. They can provide a good sense of depth, show which parts of your hand are being used for input, and help you discover which objects in the scene are interactive. (I wrote about this in more detail in my Hovercast blog post.)

For the guitar project, there’s a “strum zone,” which is basically a 3D rectangle. To strum the guitar, your finger has to be within that zone. In that scenario, I thought it was important – necessary, really – to give the user a strong sense of depth. The visual indicators are helpful for keeping your strumming finger inside of the zone when you want to hit the strings, and outside of the zone when you don’t.

These twinkles are the first in his visual clue experiments.

Why not just allow the hands at any depth to interact with the strings?

Without some depth restrictions, the guitar strings would make noise with just about any hand movement. While I’m not looking for total realism, I do want the strumming to feel somewhat natural. I play “real” guitar, so I’m using that experience as a general guideline. Sometimes you want to strum downward, or to strum upward, or to hit individual strings, or to move your hand without hitting the strings at all. The “strum zone” is a good way to achieve those goals.

Stage 2: Chord Selection

In his second update video, Zach added the ability to select chords using input from the user’s left hand.

What’s the thinking behind the chord selection design?

Working with 3D input devices like the Leap Motion, I try to find the simplest ways to accomplish a task, and also to utilize the strengths of the device. I have to consider which movements, poses, and gestures will work most reliably with the hardware, and how easily they can be learned by new users. I’ve done a significant amount of exploration with the “hover” interaction, and found that it works well, so I decided to try it for the chord selectors.

The layout of the chord selectors should seem familiar to guitar players. The selectors are arranged in a grid to match the first five notes of the first three strings. Essentially, you’re selecting the bass note of the chord, and the other strings update to form the full chord. I may also add the ability to switch between major and minor chord formations – possibly using the orientation of your hand or the “spread” of your fingers.

Stage 3: Additional Depth Cues for VR

When it comes to visual indicators in VR, there’s a delicate balance between being distracting and being easy to overlook. In the third video, Zach has added a few new types of visual indicators that hit this crucial balance.


Are you concerned that visual indicators on the edge of the screen risk causing distraction in the periphery of your vision?

I see it the other way – placing them at the edges of vision helps avoid distraction. My first attempt at a depth indicator (in the second video) placed graphics on the front and back sides of the “strum zone.” This worked well for slow, deliberate tests, but not as well in an actual guitar-playing scenario. The indicators were too subtle to see clearly when moving quickly, and making them brighter or bolder meant more clutter in front of the strings and selectors.

You made several “layers” of indicators in this update video. Red/yellow/green ones to show depth and whether the hands are in the “strum” space. White highlight blocks to help show you where your chord hand was in space. And light grey blocks to show on the sides where the strings lined up.

That’s a somewhat complicated combo, yet it seems to work beautifully. What was your thought process here?

Thanks! So far, I agree – I think I’m on the right track with this latest concept. Wrapping the visual indicators around the “strum zone” allows them to be easier to see – or maybe, easier to perceive – without obstructing your view of the main interactions.

My goal is for the user to be aware of these visual indicators, and find them helpful, without really paying attention to them. For example, you might be focused on your chord selections, but with the flash of red near the edge of your vision, you immediately know when, and where, you have moved outside the “strum zone”.

Of course, the design of these indicators is still in an early phase. As I refine them, I anticipate that certain colors or shapes or sizes will be more effective than others, and that each “layer” will retain distinct visual cues. All of these elements need to be balanced properly to make this “complicated combo” work well.

For the latest on Zach’s VR Guitar, be sure to watch his 3D Jam project thread on our community forums. What do you think of the demo so far? Let us know in the comments!


Controlling the Physical World with Leap Motion and Raspberry Pi


When the Leap Motion Controller made its rounds at our office a couple of years ago, it’s safe to say we were blown away. For me at least, it was something from the future. I was able to physically interact with my computer, moving an object on the screen with the motion of my hands. And that was amazing.

Fast-forward two years, and we’ve found that PubNub has a place in the Internet of Things… a big place. To put it simply, PubNub streams data bidirectionally to control and monitor connected IoT devices. PubNub is the glue that holds any number of connected devices together – making it easy to rapidly build and scale real-time IoT, mobile, and web apps by providing the data stream infrastructure, connections, and key building blocks that developers need for real-time interactivity.

With that in mind, two of our evangelists had the idea to combine the power of Leap Motion with the brains of a Raspberry Pi to create motion-controlled servos. In a nutshell, the application enables a user to control servos using motions from their hands and fingers. Whatever motion their hand makes, the servo mirrors it. And even cooler, because we used PubNub to connect the Leap Motion to the Raspberry Pi, we can control our servos from anywhere on Earth.


In this post, we’ll take a general look at how the integration and interactions work. Be sure to check out the full tutorial on our blog, where we show you how to build the entire project from scratch. If you want to check out all the code, it’s available in its entirety in our project GitHub repository and on the Leap Motion Developer Gallery.


Detecting Motion with Leap Motion

We started by setting up the Leap Motion Controller to detect the exact data we wanted, including the yaw, pitch, and roll of the user’s hands. In our tutorial, we walk through how to stream data (in this case, finger and hand movements) from the Leap Motion to the Raspberry Pi. To mirror the user’s hands in real time, the Leap Motion software publishes messages 20 times a second with information about each of your hands and all of your fingers via PubNub. On the other end, our Raspberry Pi is subscribed to the same channel and parses these messages to control the servos and the lights.
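The full tutorial walks through the exact code; as a rough sketch of the publisher side (the keys, channel name, and throttle below are placeholders, and this uses PubNub’s current JavaScript SDK rather than necessarily the version from the tutorial):

// Browser side: stream simplified hand orientation to a PubNub channel.
var pubnub = new PubNub({ publishKey: "demo", subscribeKey: "demo" });
var lastSent = 0;

Leap.loop(function(frame) {
  var now = Date.now();
  if (frame.hands.length === 0 || now - lastSent < 50) return; // ~20x a second
  lastSent = now;

  var hand = frame.hands[0];
  pubnub.publish({
    channel: "leap-servos", // hypothetical channel name
    message: { roll: hand.roll(), pitch: hand.pitch(), yaw: hand.yaw() }
  });
});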

Controlling Servos with Raspberry Pi

In the second part of our tutorial, we walk through how to receive the Leap Motion data with the Raspberry Pi and drive the servos. This part looks at how to subscribe to the PubNub data channel and receive Leap Motion movements, parse the JSON, and drive the servos using the new values. The result? Techno magic.
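The tutorial implements the receiver its own way; here’s a hedged sketch of the same idea using Cylon.js on the Pi (the pin number and the radians-to-degrees mapping are assumptions, and it needs the cylon-raspi and cylon-gpio modules):

var Cylon = require("cylon");
var PubNub = require("pubnub");

var pubnub = new PubNub({ subscribeKey: "demo" }); // placeholder key

Cylon.robot({
  connections: { raspi: { adaptor: "raspi" } },
  devices: { servo: { driver: "servo", pin: 11 } }, // a PWM-capable pin

  work: function(my) {
    pubnub.addListener({
      message: function(event) {
        // Map roll (radians, roughly -pi/2..pi/2) onto the servo's 0-180 range
        var deg = Math.round((event.message.roll + Math.PI / 2) * 180 / Math.PI);
        my.servo.angle(Math.min(Math.max(deg, 0), 180));
      }
    });
    pubnub.subscribe({ channels: ["leap-servos"] });
  }
}).start();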


Wrapping Up

We had a ton of fun building this demo, using powerful and affordable technologies to build something really unique. What’s even better about this tutorial is that it can be repurposed for any case where you want to detect motion from a Leap Motion Controller, stream that data in real time, and carry out an action on the other end. You can open doors, close window shades, dim lights, or even play music notes (air guitar anyone?). We hope to see some Leap Motion, PubNub, and Raspberry Pi projects in the future!


3D Jam: Now with Over $75K in Prizes!


Our second annual 3D Jam kicks off in just a few weeks, and it’s bigger than ever! Today we’re excited to announce new prizes for competitors, bringing up our prize total to over $75,000. And we’re just getting started.

Beginning September 28th, developers around the world will compete to build the most amazing motion-controlled experiences for desktop, AR/VR, the Internet of Things, and beyond. The competition runs for 6 weeks, with registration open now. Everyone who signs up for the 3D Jam gets a special hardware discount code when they register, and teams who complete their submission by the November 9th deadline get their hardware cost refunded. See our updated official rules for details.

Thanks to the generous teams at Unity, OSVR, and NVIDIA, jammers now have the chance to win the following along with $50,000 in cash prizes:

  • 2 Unity Suites
  • 9 Unity Pro licenses
  • 6 OSVR Hacker Dev Kits
  • 6 NVIDIA GeForce GTX 980 Ti graphics cards


Prize Breakdown

AR/VR TRACK
Augmented and virtual reality experiences built on tethered HMDs

1st Prize
$10,000
Unity Suite
2 OSVR HDKs
NVIDIA GeForce GTX 980 Ti

2nd Prize
$7,500
Unity Pro
OSVR HDK
NVIDIA GeForce GTX 980 Ti

3rd Prize
$5,000
Unity Pro
OSVR HDK
NVIDIA GeForce GTX 980 Ti

4th Prize
$2,500
Unity Pro
OSVR HDK

5th Prize
$1,000
Unity Pro
OSVR HDK

Community Favorites (2)
$500
Unity Pro

OPEN TRACK
All other platforms, from desktop and mobile to the Internet of Things

1st Prize
$10,000
Unity Suite
NVIDIA GeForce GTX 980 Ti

2nd Prize
$7,500
Unity Pro
NVIDIA GeForce GTX 980 Ti

3rd Prize
$5,000
Unity Pro
NVIDIA GeForce GTX 980 Ti

Community Favorite (1)
$500
Unity Pro

Unity is an incredible game development engine that lets you build for almost any platform. Our Unity Core Assets make it easy for developers to get started with desktop and VR/AR, including image passthrough, Image Hands, and UI Widgets. Imagine what you could build with a one-year professional license and the power of Unity at your fingertips.

OSVR is both an open source VR development platform and an upcoming development kit. Their goal is to create an open framework that brings input devices, games, and output devices together.

Finally, whether you want to experience the future of VR, or you just want a kickass gaming rig, the GeForce GTX 980 Ti graphics card from NVIDIA is the way to go.

The 3D Jam Tour

This month, we’re also hitting up DC, Philly, New York, Boston, and Toronto as part of our global 3D Jam Tour. Our European tour in August packed in meetups and workshops in Cologne, Berlin, and London – with lots of developers geared up and ready for the competition. Check out our Meetup page for more details!


Are you ready to take your place in the 3D Jam? Register for the 3D Jam online now to gear up and get started, and stay tuned for more announcements in the coming weeks!


From Objects to Scenes to Stories: The Magic of 3D


What makes a collection of pixels into a magic experience? The art of storytelling. At the latest VRLA Summer Expo, creative coder Isaac Cohen (aka Cabbibo) shared his love for the human possibilities of virtual reality, digital experiences, and the power of hugs.

Isaac opens the talk by thinking about how we create the representation of 3D space in the digital world of ones and zeros – a place where nothing really exists, but everything is possible. Just connecting a series of one-dimensional dots can create a line, a plane, a fractal, or even things completely outside our everyday understanding.


He then dives into the dimension of storytelling through crafting and chaining together imaginary objects, and how perspective can be emotionally powerful. Like climbing to the top of a mountain and seeing how everything in your world is interconnected, depth and perspective can take experiences to an emotional, visceral level.

Isaac’s imagination is synesthetic, combining music and visuals in multiple dimensions. Pulsing space creatures with shimmering tendrils. Psychedelic jellyfish created from the structure of sound. Living comets around a dying star. It’s possible to give these creatures life within a graphics card, and expression through a web browser, but it’s the connections between them that give them meaning.

“This allows the opportunity to tell more in that story. To provide more depth. To provide more perspective. To let people rise above the void that separates them from other people, walk around that, and give their homie a hug. That’s what we have to strive for – let humans be more human with other humans in a more real way.”

At this point, Isaac travels to the desolate world of Pulse, where users can connect points and create dimensions themselves as the story progresses. The world starts dark, with a rigid circuit board city and a distant moon, but springs to incredible life. (Isaac’s journey through the world of Pulse starts at 18:21.)


“It’s like giving someone the opportunity to participate in that movement between dimensions, because it is so, so, so much more magical… to be inside there. The object is used to provide a context to other objects, to create a scene. But then somehow you can use a bunch of scenes to provide context for each other to make a story.”

During the second half of the talk, he turns to one of his latest projects – Enough, a children’s storybook in WebGL. From the foreword:

“It’s difficult to describe the joy that I found from picking up a picture book and reading it cover to cover. They let me explore galaxies, ride dinosaurs, slay dragons. They let me dig deep down into my own being as I wished upon a magic pebble, boarded a train bound for the north, or soared through the sky on a plane made from dough.

“I know I can never recreate the splendor, magnificence, or beauty that I found in these majestic works, but I hope that this project will still remind you of the wonder you found in these moments. Those times when you could be anything, go anywhere, and find magic in the most fragile of places.”


Whether it’s a movie, a game, or a story around a campfire, storytelling works by building and bridging scenes to create a narrative thread. And when everything comes together, it’s nothing short of magic.


The Essential 3D Jam Development Guide


With the 3D Jam just around the corner, we thought we’d give you a headstart – with a full guide to the very latest resources to bring your ideas to life. In this post, we’ll cover everything you need to know about our integrations and best practices for augmented and virtual reality, desktop, and the Internet of Things.

A quick note: VR/AR is a rapidly emerging ecosystem, and many of the engine tools and features that we use to build our Unity and Unreal assets are constantly shifting. We have major updates for these assets in the works, so stay tuned!

Getting Started Checklist

  1. Create a developer account to download the SDK and integrations you’ll need to start building
  2. Check out the 3D Jam official rules
  3. Get inspired by the 2014 3D Jam and our Developer Gallery (now with tags for VR, open source, and more)
  4. Join the discussion on our developer forums
  5. Get connected on Facebook, Twitter, Instagram, and Twitch
  6. Share your project ideas with the hashtag #3DJam!
  7. Add developers@leapmotion.com to your email contacts list to make sure you receive important updates

Design 101: Escaping from Flatland

Whatever platform you’re building on, it’s important to know that designing for motion control involves a whole new way of thinking about interactions. Physics engines aren’t designed with hands in mind, and traditional UIs are built for 2D screens. Here are some key resources that will help you build compelling experiences that feel natural:

Introduction to Platforms & Integrations

With APIs for six programming languages and dozens of platform integrations, the Leap Motion SDK has everything you need to get started. In this section, we’ll cover our community’s three most popular environments for desktop and VR/AR: Unity, Unreal, and JavaScript. You can find more platforms for artists, creative coders, designers, and more on our Platform Integrations & Libraries page.

See all integrations

To get started with VR/AR development, make sure you also check out our VR Getting Started page. Along with a setup guide, it includes links to more resources, including demos and documentation.


Unity3D

Unity is a powerful game engine that makes it easy to rapidly build and develop VR and desktop projects. Here’s a four-minute guide to building your first Unity VR demo.

Along with a full Leap Motion + Oculus 0.7 integration and a variety of demo scenes, our core assets include several additional features that really stand out:

  • Image Passthrough: This gives you access to the raw infrared camera feed from the controller, letting you see the world in ghostly infrared.
  • Image Hands: Bring your real hands into virtual reality using the live cameras instead of rigged models.
  • Photorealistic Rigged Hands: Use these to bring realism into your desktop projects.
  • UI Widgets: Buttons, sliders, scrollers, and dials that provide the fundamental building blocks for menus and interfaces.

Get started with Unity

Unreal

Right now, we recommend that everyone working with Unreal Engine build with getnamo’s community plugin, which includes Unreal 4.9 and VR support. It’s important to know that we’re still building out the full feature set for the Unreal plugin, so be sure to watch this forum thread in the coming weeks as we continue to bring our Unreal integration up to speed!

Get started with Unreal

LeapJS and WebVR

The Leap Motion software also includes support for modern browsers through a WebSocket connection. LeapJS, our JavaScript client library, is hosted on a dedicated CDN using versioned URLs to make it easier for you to develop web apps and faster for those apps to load. You also have access to a powerful and flexible plugin framework to share common code and reduce boilerplate development. In particular, the rigged hand lets you add an onscreen hand to your web app with just a few lines of code.

Recently, we’ve been using experimental browsers to play around with virtual reality on the web. Mozilla provides the Oculus integration out of the box with a special VR Firefox build, while the latest Chrome builds are available here. For a quick boilerplate, be sure to check out our VR Quickstart demo, or reach into VR Collage to see a more complex project. Each of these projects is fully documented, and you can find a deep dive into the boilerplate on the Leap Motion blog.

Get started with JavaScript


Internet of Things

For hardware hackers, boards like Arduino and Raspberry Pi are the essential building blocks that let them mix and mash things together. And while these devices don’t have the processing power to run our core tracking software, there are many ways to bridge hand tracking input on your computer with the Internet of Things.

Note: If you’re looking to submit an IoT hack to the 3D Jam, please check out our preliminary approved hardware thread. While we’re open to expanding our hardware support based on your requests, please note that not all requests may be granted.

On our blog, we cover a couple of platforms that can get you started right away (along with some open source examples):

Cylon.js

For wireless-enabled controllers, it’s hard to beat the speed and simplicity of a Node.js setup. Cylon.js takes it a step further with integrations for (deep breath) Arduino, Beaglebone, Intel Galileo and Edison, Raspberry Pi, Spark, and Tessel. But that’s just the tip of the iceberg, as there’s also support for various general purpose input/output devices (motors, relays, servos, makey buttons), and inter-integrated circuits.

On our Developer Gallery, you can find a Leap Motion + Arduino example that lets you turn LEDs on and off with just a few lines of code, plus an AR.Drone integration. Imagine what you could build!

Get started with Cylon.js

Vuo

Vuo is a visual scripting platform that features the ability to connect a wide variety of inputs and outputs. The latest version includes support for Serial I/O, which opens up access to a range of devices, including Arduino boards.

Get started with Vuo

The 3D Jam is fast approaching, so be sure to start your project early! We’ll keep you posted with the latest 3D Jam news and platform updates.


Welcome to the 2015 3D Jam!


On your mark, get set, GO! This morning, our second annual 3D Jam kicks off with developers around the world competing for over $75,000 in cash and prizes – building brand new experiences for virtual reality, desktop, mobile, and beyond. Submissions are now open at itch.io/jam/leapmotion3djam.

With over 150 complete experiences submitted to last year’s 3D Jam, we saw everything from sci-fi space stations to the inner workings of the human body. Virtual reality experiences dominated the field, representing 14 of the top 20 and taking all three finalist spots. This year, developers have registered from over 80 countries around the world – twice the number from last year! We’ve also switched up the 2015 competition with two tracks: AR/VR and Open. The AR/VR track covers experiences built on tethered HMDs like the Oculus Rift, while the Open track covers desktop, hardware hacks, and the Internet of Things.

Over the next six weeks, developers will be racing the clock to get their projects on itch.io/jam/leapmotion3djam by November 9th at 11:59:59 pm PST (full contest rules here). Registrations will remain open until the submission deadline. If you haven’t already, we encourage competitors to register and get their hardware as early as possible. Everyone who registers gets a special discount code for our web store, and teams with complete submissions get refunds for the cost of their hardware.


Prizes

AR/VR Track

  • 1st Prize: $10,000, Unity Suite, 2 OSVR HDKs, NVIDIA GeForce GTX 980 Ti
  • 2nd Prize: $7,500, Unity Pro, OSVR HDK, NVIDIA GeForce GTX 980 Ti
  • 3rd Prize: $5,000, Unity Pro, OSVR HDK, NVIDIA GeForce GTX 980 Ti
  • 4th Prize: $2,500, Unity Pro, OSVR HDK
  • 5th Prize: $1,000, Unity Pro, OSVR HDK
  • Community Favorites (2): $500, Unity Pro

Open Track

  • 1st Prize: $10,000, Unity Suite, NVIDIA GeForce GTX 980 Ti
  • 2nd Prize: $7,500, Unity Pro, NVIDIA GeForce GTX 980 Ti
  • 3rd Prize: $5,000, Unity Pro, NVIDIA GeForce GTX 980 Ti
  • Community Favorite (1): $500, Unity Pro


Development Resources

We’ve made some huge advances since the 2014 Jam, with new resources and integrations that will take your projects to the next level. (You can read our development guide for a full breakdown of our top resources and best practices.) Our Core Assets for Unity now feature full Oculus 0.7 support, image passthrough, Image Hands, and UI Widgets.

On the Unreal side, we’re collaborating with the enormously talented getnamo to bring new assets to his community plugin. Right now, the plugin includes full support for Unreal 4.9.1 and VR, with Image Hands on the way. Stay tuned to our community forums for updates.

Hardware hackers also have access to more resources as the Internet of Things continues to grow. Integrations like Cylon.js and Vuo are making it easy for developers, designers, and artists to bridge the divide between people and technology in new and exciting ways. If you’re looking to submit a hardware project on the Open Track, be sure to check out our approved hardware list.

We can’t wait to see how you push the frontiers of technology with Leap Motion interaction. Touch base with your fellow jammers with the hashtag #3Djam, follow us @LeapMotion on Twitter and Facebook, or join the conversation on Reddit. Check out our community forum thread to find team members and get the latest updates. Good luck!


Infographic: Building Your 3D Jam VR Project


Changing How People Look at Physical Therapy


In the tech world, “making the world a better place” has become a bit of a cliché. But with over a billion people living with some form of disability or impairment, medical technology can make a huge difference in people’s everyday lives. That’s why Virtualware is using Leap Motion technology to help people recovering from strokes, living with Parkinson’s disease, and more.

Put simply, VirtualRehab Hands is a mini-gaming platform that lets doctors monitor the progress of patients from anywhere in the world. The games are fun and simple, using Leap Motion’s highly responsive hand tracking technology to let patients control game elements on the screen. According to Virtualware, their system is the very first virtual rehabilitation software to be classified as a medical device, under the EU’s Medical Device Directives. It joins TedCas and MotionSavvy in bringing Leap Motion technology to the assistive healthcare space.


The system has already been tested in installations in Europe, Latin America and the Middle East, according to David Fried, the company’s Director of International Business Development. Right now, it’s being used in the National Hospital for Neurology & Neurosurgery at Queen Square in London. Over the next few weeks, it’s slated to be installed in two more London hospitals – including one where it will be used for telerehabilitation (remote treatment) with stroke patients.

“We want to help make telerehabilitation a reality around the world. This involves a truly affordable technology solution that makes hand rehabilitation a more engaging experience for people of all ages in clinical settings as well as at home,” said David.


“One of the most interesting things that became evident when people first started using VirtualRehab Hands is the real demand for such solutions from the actual patients,” he continued. “People who suffer from neurological disorders and diseases are really motivated to get better, and are looking for new ways to do so, no matter what age they are.

“Leap Motion brings real independence for patients in the rehabilitation process – with its size and affordability, it allows us to provide a new method of telerehabilitation that can be used anywhere and anytime.”


In the future, Virtualware plans to add more therapeutic games for a variety of neurological and physical disorders. Each game is based on their work with neurologists and physical and occupational therapists. Beyond that, they also plan on expanding support to children with physical and developmental problems, and adding an assessment module for therapists.

Want to follow the progress of VirtualRehab Hands? Follow the creators on Twitter @virtualrehab_en! Patients and researchers can learn more by emailing the team at virtualrehab@virtualwaregroup.com.

The post Changing How People Look at Physical Therapy appeared first on Leap Motion Blog.

Escape Virtual Reality with Telekinetic Powers


Much like sketching the first few lines on a blank canvas, the earliest prototyping stage of a VR project is an exciting time for fun and experimentation. Concepts evolve, interactions are created and discarded, and the demo begins to take shape.

Competing with other 3D Jammers around the globe, Swedish game studio Pancake Storm has shared their #3DJam progress on Twitter, with some interesting twists and turns along the way. Pancake Storm started as a secondary school project for Samuel Andresen and Gabriel Löfqvist, who want to break into the world of VR development with their project, tentatively dubbed Wheel Smith and the Willchair.

In their first video, they begin by exploring a telekinesis-like manipulation mechanic, combined with a simple locomotion choice of a motorized wheelchair. (With any luck, it will turn out looking like one of these wheelchairs!) The interaction loop is fun and simple – look at an object and lift it with a telekinetic gesture, take aim, then push out to fire the object at the target.

Locomotion is one of the biggest challenges in VR development, with solutions ranging from omni-directional treadmills and Blink to Superman-like flight in Leap Motion demos like Weightless. Pancake Storm’s demo is explicitly designed as a seated experience where your locomotion is controlled by leaning with the Oculus positional tracker – an approach that reinforces the user’s sense of body presence.

The second video’s moodier tone plants the seeds of a darker narrative that will drive the gameplay forward. Samuel and Gabriel found themselves thinking about a classic dungeon crawler combined with telekinetic powers and antagonistic AI. “When you put on the VR headset, you’re stuck in the game. We’re going to have a voice in the background, pretty much bullying you.”

You’ll also notice that this version includes Image Hands, now available in our Unity Core Assets for Oculus SDK 0.7. If you’re building with Unity, this is definitely the way to go.

In this latest video, the core concept comes more clearly into view. The lighting is less dark and moody, and the demo now feels more like an exploratory puzzle game. As Pancake Storm keeps iterating on the project, we can’t wait to see how it evolves from here.

How is your 3D Jam project evolving? Share your progress on Twitter @LeapMotion with the hashtag #3DJam! Remember to post early demos and videos on our itch.io site ahead of the November 9th deadline for valuable community feedback.

The post Escape Virtual Reality with Telekinetic Powers appeared first on Leap Motion Blog.

Happy #ScreenshotSaturday! 3D Jam Mid-Progress Roundup


It’s #ScreenshotSaturday, and you know what that means – time to take a look at the very latest projects for the 3D Jam. We’re almost at the halfway mark, and developers are starting to share early glimpses of their builds. There are already three early entries on our itch.io site, and we can’t wait to see what you have in store as the jam progresses. Here’s the latest and greatest from around the web:

The post Happy #ScreenshotSaturday! 3D Jam Mid-Progress Roundup appeared first on Leap Motion Blog.

Reach into the Digital World: Getting Started with Leap Motion @ HackingEDU


The world is changing – can you hack it? At Leap Motion, we believe that the next wave of technological interfaces will rely on the original human operating system: your hands. Whether you’re giving people the power to grab a skeleton, reaching into a human heart, or teaching anyone how to program, hands are powerful.

With HackingEDU just around the corner, Leap Motion is sponsoring the world’s largest education hackathon with over 100 Leap Motion Controllers for attendees to use. In the past, our community has built some incredible educational projects that bring a new level of interaction (and fun) to classroom activities. This is your time to hit the ground running and build an awesome project like:

Learning Earth Science through Gaming

Defend Against Zombies, Learn How to Code

LA Hack Finalists Armateur: Robotic Arm + Leap Motion + Bluetooth

RadioHacktive: Filling an Analog Drone with Digital Goodness

While you can find all of our platforms for artists, creative coders, designers, and more on our Platform Integrations & Libraries page, this post only covers some of the most popular hackathon platforms. After all, with just 36 hours to build, you need to ramp up fast!

Getting Started with Leap Motion

The Leap Motion Controller is a small USB sensor that tracks how you naturally move your hands, so you can reach into the world beyond the screen – in virtual reality, augmented reality, Mac or PC. The hardware itself is fairly simple, with three LEDs and two infrared cameras. It can track your hands up to about two feet away, converting the raw image feed into a rich array of tracking data. You even have access to the raw infrared camera feed, letting you create augmented reality experiences.

Once you have your controller plugged in and the Leap Motion SDK installed, you’re ready to begin. Our Unity, Unreal, and JavaScript integrations already include model hands that you can quickly drop into any project. But before you dig into development, here’s what you need to know about designing for the Leap Motion Controller.

Design 101: Escaping from Flatland

As a new way of interacting with technology, designing for motion control also involves new ways of thinking about interactions. Physics engines aren’t designed with hands in mind, and traditional UIs are built for 2D screens. Here are some tips that will help you build compelling experiences that feel natural:

Don’t settle for air pokes. Imagine how you would control your computer with your bare hands. Rather than simply using them in place of a mouse or touchscreen, you can push, pull, and manipulate the digital world in three dimensions!

The sensor is always on. Motion control offers a lot of nuance and power, but unlike with mouse clicks or screen taps, your hand doesn’t have the ability to disappear at will. Avoid the “Midas touch” by including safe poses and zones that let users comfortably move their hands around without interacting (see the sketch after these tips).

Use easily tracked poses. Whenever possible, encourage users to keep their fingers splayed and hands perpendicular to the field of view. Grab, pinch, and pointing gestures tend to perform well, as long as they’re clearly visible to the controller.
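To make the “always on” principle concrete, here’s a minimal sketch of an activation zone, built with the LeapJS library covered below. The threshold value is an assumption, so tune it for your own setup:

// Only register interactions once the palm crosses an activation plane
// in front of the controller; elsewhere, hands can move around freely.
var ACTIVE_Z = -50; // millimeters; negative z points away from the user

Leap.loop(function(frame) {
  frame.hands.forEach(function(hand) {
    // palmPosition is [x, y, z] in millimeters relative to the controller
    var inZone = hand.palmPosition[2] < ACTIVE_Z;
    if (inZone && hand.pinchStrength > 0.9) {
      console.log("pinch registered, safe to interact");
    }
  });
});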

For more tips, check out our Introduction to Motion Control, VR Best Practices Guidelines, and 4 Design Problems for VR Tracking (And How to Solve Them).

Building a 3D Desktop App with Unity

Unity is a powerful game engine that makes it easy to rapidly build and develop desktop and VR projects. Here’s a quick video that shows you how to make a VR demo from scratch in just four minutes:

You can also check out our Unity setup guide to see how you can start building.

Building a Web App

Want to build a web application? Leap Motion makes it easy with LeapJS, our JavaScript client library. Like Unity, it includes a rigged hand asset that lets you add an onscreen hand to your web app with just a few lines of code. To get started, check out these Hello World demos and learn how you can design VR experiences with WebVR.
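Here’s a minimal sketch of what those few lines look like. It assumes leap.min.js is already loaded on the page (in Node, use require("leapjs") instead):

// Log a message whenever a tracked hand closes into a fist.
Leap.loop(function(frame) {
  frame.hands.forEach(function(hand) {
    // palmPosition is [x, y, z] in millimeters relative to the controller
    var pos = hand.palmPosition;
    // grabStrength runs from 0 (open hand) to 1 (closed fist)
    if (hand.grabStrength > 0.8) {
      console.log("Fist at x: " + pos[0].toFixed(0) + ", y: " + pos[1].toFixed(0));
    }
  });
});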

Visual Programming for Artists and Musicians

Available on Mac, Vuo is a visual programming language that lets you easily prototype, mix, and mash up multimedia experiments. By treating code as building blocks, artists and designers can quickly create amazing experiences that combine visuals and sound. You can weave music from the air or create a physics simulation like this gravity mesh example:


Hardware Hacks

For hardware hackers, boards like Arduino and Raspberry Pi are the essential building blocks that let them mix and mash things together. And while these devices don’t have the processing power to run our core tracking software, there are many ways to bridge hand tracking input on your computer with robots, drones, and more. Check out this quick getting started tutorial for Cylon.js, which lets you connect just about any device you can imagine:
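If you want a taste before diving into the tutorial, here’s a hedged sketch (our assumption of a typical setup, not the tutorial’s code) that sweeps a hobby servo with your palm. The pin number and serial port will vary with your board:

"use strict";

var Cylon = require("cylon");

Cylon.robot({
  connections: {
    leapmotion: { adaptor: "leapmotion" },
    arduino: { adaptor: "firmata", port: "/dev/ttyACM0" }
  },

  devices: {
    leapmotion: { driver: "leapmotion", connection: "leapmotion" },
    servo: { driver: "servo", pin: 3, connection: "arduino" }
  },

  work: function(my) {
    my.leapmotion.on("frame", function(frame) {
      if (frame.hands.length > 0) {
        // Map the palm's x position (roughly -200..200 mm) onto 0..180 degrees
        var x = frame.hands[0].palmPosition[0];
        var angle = Math.max(0, Math.min(180, Math.round((x + 200) * 0.45)));
        my.servo.angle(angle);
      }
    });
  }
}).start();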

We can’t wait to see what you build at HackingEDU 2015! Tweet us @LeapMotion #hackingEDU and share your projects.

The post Reach into the Digital World: Getting Started with Leap Motion @ HackingEDU appeared first on Leap Motion Blog.

Tectonic Shift: Why Education is About to Change Forever


At its most powerful, education harnesses our natural curiosity as human beings to understand the universe and everything in it. This week on the blog, we’re exploring what it means to actually reach into knowledge – and why developers are at the forefront of how the next generation is learning about the world they live in.

Seeing a geological diagram in a textbook is one thing. But reaching out and creating massive volcanoes with your bare hands? Rearranging the continents by searching for hidden fossil patterns? Now you’ve got some magic in the classroom.

Educational gaming is on the verge of a major turning point, and one of the leading forces is Gamedesk – an LA-based research institute, commercial development studio, online community platform, and physical school.

Recently, Gamedesk released a lengthy white paper detailing how they built a set of “kinesthetic learning” games that teachers can use to teach complicated geoscience concepts to students aged 12 to 15. These include the Leap Motion games GeoMoto and Pangean, which let you rearrange continents, shift tectonic plates, and form volcanoes. Pangean and GeoMoto are both available for free download on Gamedesk’s website and on our Developer Gallery.

Pangean

Formerly known as Continental Drift, this puzzle game introduces the essentials of continental drift before moving on to plate tectonics. As a galactic member of the United Colonies, you travel the universe in your own scouting ship – using your hologram interface to piece together continents and demonstrate the shift that occurs over a hundred million years.

Use the fossil probe to reveal patterns in where creatures once lived, and the sonar to scan for eroded portions of the continent. Your final mission? Returning present-day Earth to its Pangaea state! To help students absorb the lesson, teachers can ask: Why do you think the continents can be connected with each other? How did you use fossil remains to help you connect the continents? And why do you think similar fossils are now found on different continents?


GeoMoto

Building on their insights from the other three games in the series, GeoMoto (formerly Plate Tectonics) gives players a more direct relationship to geo-concepts. In other words, pulling, smashing, and grinding tectonic plates together!

Using the Leap Motion Controller, players navigate around a world with no geographic features, then shift and experience the motion of the plates with hand movements. You can see how plate tectonics create volcanoes, folded mountains, rift valleys, and seafloor spreading, then learn about different types of faults and the Richter scale.


Kinesthetic Learning and the Future of Education

Geoscience is a complicated subject that involves thinking about the Earth as a fluid and complex system that’s constantly changing. These can be difficult concepts for kids, so Gamedesk used a kinesthetic learning approach to shed new light on the subject. This is a learning style that lets students engage physically with complex subjects through movement and action, rather than just watching a video.

Along with the creative and educational possibilities of virtual reality, we’re excited to see where motion-controlled gaming will take the next generation of students. You can download Pangean and GeoMoto from Gamedesk’s website. Be sure to check out their white paper to learn how the games were researched, built, and tested – including lesson plans and resources for teachers!


The post Tectonic Shift: Why Education is About to Change Forever appeared first on Leap Motion Blog.
