Welcome to the site of Kearwood Gilbert
The first production powered by my open source Kraken Engine has been released! There are two parts to Circa 1948:
An installation art piece: a four-projector, 360-degree C.A.V.E. system that participants enter to be surrounded by the screens while tracked by Kinect sensors. Participants can navigate through Hogan's Alley and the Hotel Vancouver circa 1948 and eavesdrop on the voices of the past to reveal the story in a non-linear fashion.
Hierarchical LOD is now working! This will allow you to create huge, detailed worlds with distant objects in view.
In Kraken, Hierarchical LOD works by breaking the scene into pieces at multiple levels of detail. Only the pieces closest to you are displayed at the highest detail. Unlike the simple LOD meshes that Kraken already supports, Hierarchical LOD works on entire branches of the scene graph, switching them on and off as a unit.
Apple's OpenAL reverb and 3D "headphone quality" spatialization have been broken on iOS since iOS 6.0, resulting in issues such as no output or output from only one speaker. To address this, Kraken will be gaining its own low-latency audio engine with HRTF-based 3D spatialization and convolution reverb.
Convolution reverb with a 2+ second stereo impulse response is now working in real time on an iPad 2! I'm making extensive use of the FFT functions in the Accelerate framework on iOS and OS X.
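The structure that makes a long impulse response workable in real time is block-based (overlap-add) convolution. Here is a minimal sketch of that partitioning, with a direct convolution standing in for the FFT multiply (in the real engine, Accelerate's vDSP FFT replaces the inner loop); `convolveBlock` and `overlapAdd` are illustrative names, not Kraken's API.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Convolve one input block with the impulse response. In a real-time engine
// this direct O(N*M) loop is replaced by FFT -> complex multiply -> inverse
// FFT, but the output it produces is identical.
std::vector<float> convolveBlock(const std::vector<float>& block,
                                 const std::vector<float>& ir) {
    std::vector<float> out(block.size() + ir.size() - 1, 0.0f);
    for (size_t n = 0; n < block.size(); ++n)
        for (size_t k = 0; k < ir.size(); ++k)
            out[n + k] += block[n] * ir[k];
    return out;
}

// Process the input in fixed-size blocks, overlap-adding each block's tail
// into the following output. This keeps latency at one block length rather
// than the full length of the impulse response.
std::vector<float> overlapAdd(const std::vector<float>& input,
                              const std::vector<float>& ir,
                              size_t blockSize) {
    std::vector<float> out(input.size() + ir.size() - 1, 0.0f);
    for (size_t start = 0; start < input.size(); start += blockSize) {
        size_t len = std::min(blockSize, input.size() - start);
        std::vector<float> block(input.begin() + start,
                                 input.begin() + start + len);
        std::vector<float> y = convolveBlock(block, ir);
        for (size_t i = 0; i < y.size(); ++i)
            out[start + i] += y[i];
    }
    return out;
}
```

The overlap-add result is bit-identical to convolving the whole signal at once, which is what makes the block partitioning safe for audio.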
Kraken Engine now supports animation, bones / rigged character import, volumetric lighting, environment effects, particles, and a ray+line casting system (to support physics). First commercial application to use it will be released Spring 2013!
Kraken Engine source code is available at https://svn.kearwood.com/
A new demo scene and app container is in the works to show off the new features...
Kraken Engine Source code released!
My "from scratch" game engine for iOS, named "Kraken Engine" or "KREngine", has a while to go before it is a mature, stable product; however, it may already be of use to others, in whole or in part. For the benefit of the open source community, I am releasing the source code and opening my Subversion repository to the public. Please be advised that there may still be large, sweeping changes, especially in the API and scene graph objects.
I have posted a project page with the mission statement, design goals, road map, and collection of screenshots. Please visit the Kraken Engine Project page for detail on the engine and how to download source code.
In photography and cinema, lighting is essential in conveying the emotion and atmosphere of a scene. Often this requires the use of many lights at once, which is computationally expensive with a forward rendering approach. In order to display many lights of varying types (point, directional) at once, we can decouple the lighting calculations from the rest of the rendering by using a deferred lighting approach.
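The decoupling works like this: a geometry pass writes each pixel's surface attributes into a G-buffer once, then a lighting pass accumulates every light's contribution from those stored attributes, without re-rasterizing the scene per light. A minimal CPU sketch of the lighting pass for one pixel and one point light (the struct and function names are illustrative, not the engine's actual shaders):

```cpp
#include <cassert>
#include <cmath>

// What the geometry pass leaves behind at one pixel.
struct GBufferSample {
    float nx, ny, nz;  // world-space normal (unit length)
    float px, py, pz;  // world-space position
    float albedo;      // single-channel diffuse, for brevity
};

struct PointLight {
    float x, y, z;
    float intensity;
};

// Lighting pass: diffuse contribution of one light at one G-buffer pixel,
// with Lambertian shading and inverse-square distance falloff. The cost of
// the scene's geometry was paid once; each extra light only adds this math.
float shade(const GBufferSample& g, const PointLight& l) {
    float lx = l.x - g.px, ly = l.y - g.py, lz = l.z - g.pz;
    float dist2 = lx * lx + ly * ly + lz * lz;
    float dist = std::sqrt(dist2);
    float ndotl = (g.nx * lx + g.ny * ly + g.nz * lz) / dist;
    if (ndotl < 0.0f) ndotl = 0.0f;  // light is behind the surface
    return g.albedo * l.intensity * ndotl / dist2;
}
```

Summing `shade()` over all lights per pixel is the accumulation step; this is why deferred lighting scales with (pixels + lights) rather than (geometry x lights).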
Please click "read more" for more screenshots and technical details...
I am continuing work on my custom, from-scratch game engine for iPhone, iPod, and iPad. It is a great challenge to implement the various shader paths and allow the engine to scale up and down to accommodate variations in CPU, GPU, memory limitations, and scene complexity.
The iPad 2's A5 chip could power a real competitor to the Xbox 360 and PS3. It has proven capable of rendering PSSM (Parallel Split Shadow Mapping) and DOF (Depth of Field) in real time. It makes me wonder if an A5-based Apple TV will have official support for Bluetooth gaming peripherals.
For some more teaser screenshots of my rendering engine, please click "read more" below...
On April 3rd, 2011, the Vancouver community joined together to assist the people of Japan who lost their homes, family members, friends, and lives in the devastating earthquake and tsunami. The participants raised over $5700 for the Canadian Red Cross.
I capture video in HDV format, which uses compressed audio. When importing these clips into Adobe Premiere Pro CS5, the video gets "indexed" and the audio gets "conformed". Conforming audio essentially involves decompressing it and widening it to 32-bit samples that are more efficient for non-linear editing. These 32-bit samples are stored in Adobe's proprietary CFA files, which live either alongside your source clips or in a subdirectory of your home directory.
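The widening step itself is simple. A minimal sketch, assuming the decoder hands back 16-bit integer PCM (the CFA container format is proprietary, so this only illustrates the sample conversion, and `conformPcm16` is a made-up name):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Decode once, then widen each 16-bit PCM sample to a float in [-1, 1) so
// the editor can mix, scrub, and draw waveforms without per-frame decoding.
std::vector<float> conformPcm16(const std::vector<int16_t>& samples) {
    std::vector<float> out;
    out.reserve(samples.size());
    for (int16_t s : samples)
        out.push_back(static_cast<float>(s) / 32768.0f);  // 32768 = 2^15
    return out;
}
```

Trading disk space (32 bits per sample, uncompressed) for this kind of random access is exactly why conformed files are so much larger than the source audio.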