Thursday, February 28, 2013

Week 8 wrap up. (demos at the bottom)

Week 8, the last stage of the development (for this class). Usually around this time (the end of a development cycle), a developer would say "the programming was a success and you can now download a demo of the game at (x) location"...this is not the case here.

The current stage of the development doesn't have a public playable demo and barely has a presentation demo. This was primarily due to my lack of knowledge about the engines being used. If more research had been put into how well the engines could process a lot of points, I wouldn't be at this point right now.


= Setbacks, the engine =

When I started the project, I was very comfortable with the jPCT-AE engine because I had recently used it on my previous project. I had noticed a problem with processing a lot of objects, but those objects also had a lot of vertices each. Since this project would only have a minimal display for each object, a colored square, I thought the problem wouldn't exist....not the case.

By the time I finished the base code for the program and had something presentable, the best I was getting out of the engine with 200 objects was 14 fps (out of a possible 60 fps). This was pretty pathetic, since far more graphically intense games get higher frame rates than that.

After some research, I found the Rajawali engine, which boasted of being able to process 2,000 objects, each with its own location/rotation/texture, in a fluidly running environment. It manages this by storing all the objects in a single object and manipulating the vertices of each "object" being shown. This was the exact fix I needed to pull the game out of the <20 fps ditch it was falling into.
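
To make that concrete, here's a minimal sketch of the batching idea (the names and layout are mine, not Rajawali's): every particle owns 4 vertices inside one shared array, so the renderer draws one object instead of 2,000.

// A minimal sketch of the batching idea (my names, not Rajawali's).
public class ParticleBatch
{
    static final int MAX_PARTICLES = 2000;

    // One shared vertex array for the whole batch: 4 corners * 3 floats per particle.
    final float[] vertices = new float[MAX_PARTICLES * 4 * 3];

    // Write particle i's square into its slice of the shared array;
    // the renderer then draws this ONE object instead of 2,000 of them.
    void writeParticle(int i, float x, float y, float z, float half)
    {
        int o = i * 4 * 3;
        float[][] corners = {
            { x - half, y - half, z }, { x + half, y - half, z },
            { x + half, y + half, z }, { x - half, y + half, z }
        };
        for (float[] c : corners)
        {
            vertices[o++] = c[0];
            vertices[o++] = c[1];
            vertices[o++] = c[2];
        }
    }
}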


= Overcoming the setback (part 1) =

So, I hunted for code in the jPCT engine that would allow me to interact with the vertex points of a single object. The best thing I found was a class called "GenericVertexController", which is essentially a handle to the object's mesh. Every OnDraw call, I would call the apply function of this controller, which would run a pre-defined set of commands to adjust the vertices.

Research then went into how to effectively use this controller. I ended up making a loop in the apply function that goes through all the particle objects (objects that now only hold information about a particle) and applies their data (position) to the vertices....nothing really happened after the apply was called. But that's what I get for using incomplete code found in forums.
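
For reference, the wiring looks roughly like this. The class and constant names are from my memory of the jPCT docs (so verify them against the actual API); the loop mirrors what my apply function was trying to do:

import com.threed.jpct.*;

// Sketch from memory of the jPCT docs -- verify names against the current API.
public class ParticleMover extends GenericVertexController
{
    private final SimpleVector[] positions; // one target position per vertex

    public ParticleMover(SimpleVector[] positions) { this.positions = positions; }

    @Override
    public void apply()
    {
        SimpleVector[] dst = getDestinationMesh();
        for (int i = 0; i < dst.length && i < positions.length; i++)
            dst[i].set(positions[i]); // push each particle's position into the mesh
    }
}

// Wiring it up (mesh is object.getMesh()):
//   mesh.setVertexController(new ParticleMover(positions), IVertexController.PRESERVE_SOURCE_MESH);
// and then once per OnDraw:
//   mesh.applyVertexController();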

There was another, equally large problem too. Since all the particles were contained inside the single object, I could no longer use the convenient "setColor(int r, int g, int b)" and "setAlpha(float a)" functions that the object presented, because using them would change the color and alpha of every particle. Fine if all the particles are meant to be the same color/transparency, but in this case they are not.

So, after talking to the developer, he said I would need to use a vertex shader to change the colors, and that I could just send an array of colors to the shader to be applied to the objects....but the engine currently did not have this feature, because of some problem the developer himself was unsure of. Not the most reassuring thing to hear.


= Overcoming the setback (part 2) =

Since jPCT-AE wasn't able to give me more power/functionality, I decided to swap the jPCT engine out for Rajawali. The new engine, right up front, showed the ability to interact with vertex points, vertex colors, normals, and texture coordinates in a casual, open way. But to use it, I had to replace many functions in the game. I also had to re-write how the game ran because of the dependencies the engine requires to run.

After about 3 days of configuring objects to be compatible with the new engine, while also recreating the convenient functions I had used in the old engine, I was finally able to test the game.....well, nearly.


= Shaders, it's the new (graphical) view =

To display the particles with this new engine, I would need to write my own graphics shaders to properly color the vertex points. Good thing DeVry was keeping up with technology and taught us how to write shaders.....HA, no. Shader technology came out in 2004 and DeVry still doesn't have any classes about it, though apparently there are rumors of a class being introduced in mid 2013 to teach shaders (which would be after my graduation).

So, I was on my own to learn how to work this mysterious shader magic. After searching around on the web, I found a tutorial by NeHe which covered just about everything needed about what shaders do and how to use them. I spent a few hours reading the site (there were many pages) and came away with a nicely optimized vertex and fragment shader. I was now ready to test my game and behold the flying particles.


= Color? We don't need that stuff =

Switch out an engine for a new one, learn the language of the GPU, re-configure/re-arrange the game to support the new particle hierarchy, and expect there to be no problems? Something had to give, and in this case, it was color. I was beating my head against this problem for nearly a week, even though I did everything the tutorials did:
1. Create an object and have it extend Object3D
2. Call the constructor while creating my new object
3. Set up the vertex information
4. Pass this data to the parent object so it can be prepared for rendering
5. Create a material
6. Add the shaders to this material
7. Start rendering
.....and yet my object was black.
After looking through hundreds of lines of the engine's code (because there is no documentation for the engine), I found that a light source was required (not actually used, just required) for the color data to be sent to the shader.

Take 2, and still no color. After more research and code reading, I thought "what the hell, let's see if my shaders even work", so in the fragment shader I told it to color every fragment red....and it worked. I then told my vertex shader (on its last line) to do the same thing and adjusted the fragment shader to use the vertex shader's color....and that worked too (quite the head scratcher, isn't it?).

Turns out my fog code was being run and was changing the color to black. Which was strange, because it was nested inside an if-case that needed the variable "fogEnabled" to be true for the code to run. So, more hunting, and I added some "#ifdef FOG_ENABLED" guards like the ones used in the fog demo and enforced by the shader program compiler in the engine. My program finally had color.
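
For anyone else fighting this, the guard ends up looking roughly like this (variable names follow my shader from the earlier post; exactly when the engine defines FOG_ENABLED is its shader compiler's business):

#ifdef FOG_ENABLED
    // compiled out entirely unless the engine's shader compiler defines FOG_ENABLED
    float fogWeight = clamp((-vertexPos.z - fogStart) / (fogEnd - fogStart), 0.0, 1.0);
    vertexColor.xyz = mix(vertexColor.xyz, fogColor, fogWeight);
#endif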


= Nothing better to fix a problem than another problem =

And just when I thought "Hey, I might actually get the game out on time", the program threw another laugh at me. The problem I had in the first few weeks, converting a touch on the screen into a point in the virtual world, was back. And to add more humor, the world scaling was off and I had no easy access to the jPCT implementation of this 2D->3D conversion (well, I "acquired" some source code, but it was very heavily tied in with the rest of the engine. Too much for an easy copy/paste).
The code Rajawali had for the 2D->3D conversion was all weird. It either gave values so small that they hovered around the 0,0,0 point, or some point which wasn't even within range of the virtual world.
I looked online for how to implement the function...and this is where the project is at the moment. The game screen has 2 points (one red and one green, two test points that show the bounds of the screen...and for some reason, they are not going away despite their data being over-written) and another point which flickers between multiple colors (I set the life of the particle very low and made it change colors when being revived). The game is not appealing to play.
Although the code is very sexy and well organized, it means nothing without the 2D->3D conversion to tell where the particles will be emitted from. Sure, there is the problem of the world coordinates being smaller than in the jPCT engine, but that's nothing compared to the lack of a conversion function.
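
For completeness, here's the textbook gluUnProject-style math I'm trying to reproduce. This is my own sketch against android.opengl.Matrix (not Rajawali's code); it assumes you can get the combined projection * view matrix and the viewport size, and it drops the touch ray onto the z = 0 world plane.

import android.opengl.Matrix;

// Hypothetical helper: unprojects a screen touch onto the z = 0 world plane.
public final class Unproject
{
    // sx, sy: touch position in pixels; w, h: viewport size in pixels;
    // viewProj: combined projection * view matrix (column-major, length 16).
    // Returns {x, y, z} in world space, or null if the ray misses the plane.
    public static float[] touchToWorld(float sx, float sy, int w, int h, float[] viewProj)
    {
        float[] inv = new float[16];
        if (!Matrix.invertM(inv, 0, viewProj, 0))
            return null;

        // screen -> normalized device coordinates (y is flipped)
        float nx = 2f * sx / w - 1f;
        float ny = 1f - 2f * sy / h;

        // unproject a point on the near plane (z = -1) and far plane (z = 1)
        float[] near = mul(inv, new float[]{ nx, ny, -1f, 1f });
        float[] far  = mul(inv, new float[]{ nx, ny,  1f, 1f });

        // ray from near to far, intersected with the z = 0 plane
        float dx = far[0] - near[0], dy = far[1] - near[1], dz = far[2] - near[2];
        if (dz == 0f) return null;   // ray parallel to the plane

        float t = -near[2] / dz;
        return new float[]{ near[0] + t * dx, near[1] + t * dy, 0f };
    }

    private static float[] mul(float[] m, float[] v)
    {
        float[] r = new float[4];
        Matrix.multiplyMV(r, 0, m, 0, v, 0);
        for (int i = 0; i < 3; i++) r[i] /= r[3]; // perspective divide
        return r;
    }
}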


= Class specific questions =

1. Did the project teach you about emerging technologies?
- Yes and no. Yes, in that I finally got my hands on learning how to use shaders (thank you again, DeVry, for being in line with the new technology). But no, because this technology came out 9 years ago and is no longer "emerging technology". Sure, this was also my first attempt at implementing gravimeter (gravity + acceleration) data in a program, but that is also old technology. The only somewhat-new "emerging technology" used was that I was coding a project compatible with Android 4.0.3, which came out in early 2012....damn, I thought it was sooner than that. Correction: I didn't learn anything about emerging technologies.

2. Did it help you manage your time in relation to a project involving emerging technologies?
- I don't know what the "it" is referring to, but this project was the first time I used a time sheet to record my programming times and what I did during each session. It seems my average coding session lasted 4.5 hours (as shown in the first picture; I stop coding to eat, then resume coding). Comparing this to another project I was working on at the same time (in Flash AS2), visible features start appearing, relative to how much has been coded, around the 45-50 hour mark.

Majority of the coding was done between midnight and 10 am

Weekend coding marathons?

3. Did it help you decide how you might apply your skills in games and simulation programming to the world of ideas and technologies discussed in the class?
- To be blunt, I would have gotten an extra 40% farther on this program and would have a public alpha if I hadn't been taking this class. Although I read the lectures, read the discussion threads, joined in them a bit, and wrote papers about emerging technologies (well, more like how we got to that tech; the show Fringe showed more emerging technologies than the class did)...the class's info about emerging technologies didn't help at all with this program or with how to apply my skills.
- What the class did help me determine, based on the knowledge I acquired during those weeks, is where to apply my skills: I should really invest more time in writing my own graphics engine. The current engines out there have too much padding between the developer and the data, which limits performance (jPCT-AE), are too obnoxious to use (I'm talking about you, jMonkey. 10 MB for a mobile engine, what the hell?!?), or have so many hoops to jump through to make the engine work that it becomes a pain (Rajawali). When I make my engine, I'll make the data easy to change, provide convenient functions for the high-level stuff, and preserve access to the performance-gaining stuff. If I pass this class, I'll finally be done with school and have time to work on this engine.


= Project wrap-up =
 
Yes, the project ended on a sad note; there is no working demo. The demo I do have is riddled with performance issues (I archived the jPCT project code and made a new project for the Rajawali code), and the current build is nothing more than a black screen with a flickering dot.
Although this class is over, I do plan to continue this project and make a publicly playable game which can be downloaded from the Android market. Last I checked, there were no particle-based games with particles flying around in a 3D environment. It's not like that feat is impossible; it just requires more work than a 2D one would.

To finish what I have planned for this project, a new graphics engine will need to be created: one which allows easy creation of objects, easy alteration of an object's geometry/color, and provides as much performance as the OpenGL code allows. I don't know a lot about graphics engines, and my previous attempt to make one failed (thank you DeVry for knowing how to learn me well). But after leaving DeVry and no longer having to worry about online discussion posting requirements or weekly paper deadlines, I'll be able to focus on engine development with a clear mind and get more games rolled out and pushed to the mobile market (to start making some money, because loans are f-ing heavy).


= Here be dragons =

For those who are interested in testing what I do have, here are the links to the apk files. If you don't know how to install an apk file on your phone, ask Google.
The first attempt, using jPCT. Up to 8 fingers can be used, each one creating particles which interact with the orientation of the phone. Each added finger/particle emitter WILL slow down the phone. If you add 8 fingers, expect the program to run very slowly.

The second attempt, using Rajawali. Adding multiple fingers still works (up to 10 now, if your device supports it) and it won't give your phone a heart attack like the above link....but the only things visible are some flickering dots which hover around the center of the screen. There is also 1 red and 1 green dot; I'm still not sure why they won't leave (yes, I put them there, but their data is over-written, so they should leave...but don't).


= Below this heading is intentionally left blank =


Friday, February 22, 2013

Is this the plateau or just another step?

...I guess all good things have to come to an end eventually. My "1 line of code" in the fragment shader was wrong....I had to add 1 more line stating the default precision for floats.

So now my fragment shader says "precision mediump float;" on its first line....but technically, this isn't "running code" because it's outside of the "void main()" function...so, my fragment shader still has only 1 line of code and is still valid :-)
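
So the full (and now working) fragment shader is just:

precision mediump float;// the "non-running" line that fixed everything

varying vec4 vertexColor;

void main()
{
    gl_FragColor = vertexColor;
}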


A new kind of "point"
After checking out the GLSL tutorials a bit more, I found a better way to draw the particles.
Instead of using the default GL_TRIANGLES, which draws an independent triangle for every 3 vertices (so I'd need 2 triangles to make a particle), I could use GL_POINTS instead. This makes every vertex point a particle (square-shaped by default). The only cost of using this was that I had to change...a good large section of my vertex shader code.


new point = new shader
uniform mat4 uMVPMatrix;// model view matrix
uniform vec4 uCamPos;// where the camera is (x,y,z)
uniform float uPointSize;// base of how big to draw the particle

uniform vec4 ambiantColor;// "air" color applied to everything
uniform bool fogEnabled;
uniform float fogStart;
uniform float fogEnd;
uniform vec3 fogColor;// color applied after fog start

attribute vec4 aPosition;// vertex x,y,z
attribute vec4 aColor;// rgba

varying vec4 vertexColor;// color of the vertex point

const vec4 WHITE = vec4(1,1,1,1);

void main()
{
    // normalize colors given in the 0-255 range down to 0.0-1.0
    vec4 nColor = aColor;
    if(nColor.x > 1.0)// red
        nColor.x /= 255.0;
    if(nColor.y > 1.0)// green
        nColor.y /= 255.0;
    if(nColor.z > 1.0)// blue
        nColor.z /= 255.0;
    if(nColor.w > 1.0)// alpha
        nColor.w /= 255.0;
       
    // make the color
    vertexColor = vec4(min(WHITE, ambiantColor).xyz, nColor.w);
    vertexColor.xyz *= nColor.xyz;
   
    // make fog
    if (fogEnabled)
    {
        vec4 vertexPos = uCamPos * aPosition;
       
        float fogWeight;// how heavy the fog color will be applied
        vec3 fogVertexColor;// color from the fog
        fogWeight = clamp((-vertexPos.z - fogStart) / (fogEnd - fogStart), 0.0, 1.0);
        fogVertexColor = fogColor * fogWeight;
       
        // apply the fog
        if (fogWeight > -0.9)
            vertexColor.xyz = (1.0-fogWeight) * vertexColor.xyz + fogVertexColor;
    }
   
    gl_Position = uMVPMatrix * aPosition;
   
    vec3 cp = vec3(uCamPos);
    cp.x *= -1.0;
    float pdist = length(cp - aPosition.xyz);// figures out how far the particle is from the camera
    // then adjusts the vertex's drawn size
    gl_PointSize = uPointSize / sqrt(3.0 * pow(pdist, 3.0));
}
In case you aren't able to read GLSL code, just check out the comment lines. The biggest change, besides a re-ordering of the lines, was the inclusion of "gl_PointSize". This is a GLSL built-in that, when GL_POINTS is used, defines how large to draw the vertex...I don't know much beyond that.


COMPLETED UPDATES

I was finally able to update all my code from jPCT-AE to Rajawali. This includes a complete re-modeling of how the particles are created. When the user touches down on the screen, a ParticleSystem is created (only if one hasn't already been made for this point).

When the ParticleSystem is created, it makes an array of ParticleBuckets, which hold the particle geometry and data about each particle. A bucket also controls the size of the geometry and data arrays, calls update on all the particles, and is the only thing connecting particle data to the data which gets sent to the renderer....why this important system is called "ParticleBucket", I'm still not sure. It just sounded good at the time, and I haven't found a better name yet.

When the particle bucket is created, it makes a ParticleContainer (holder of the particle geometry data) and an array of Particles (each one containing information about 1 particle: its position, velocity, color, and life).
To speed up making particles, each Particle starts with blank data and requires a call to Create() to initialize the particle's data.
ParticleContainer has arrays of vertices/normals/colors/indices sized to the maximum each ParticleBucket can hold. If the ParticleContainer can hold a max of 200 particles and only 150 are requested, the container will still allocate space for the 200 particles.
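
In outline, the hierarchy looks like this (a compressed sketch; the field names are mine):

class ParticleSystem            // one per active touch point
{
    ParticleBucket[] buckets;
}

class ParticleBucket            // bridges particle data and renderable geometry
{
    ParticleContainer container; // the single object the renderer sees
    Particle[] particles;        // per-particle position, velocity, color, life
}

class Particle
{
    float x, y, z, vx, vy, vz;
    float life;                  // <= 0 means dead
    int color;

    void Create(float ex, float ey, float ez)
    {
        // starts blank; initialized here only when the particle is emitted
    }
}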


And then the problem hits

After all the code updates and shader coding...I've now come to a point where the program always crashes in the same place. While creating the ParticleContainer, after the initial data is created, setData() is called in its base class (BaseObject3D), which creates the buffer arrays that get passed to the graphics card.
Every time now, it crashes on the same line, "BufferUtil.copy(vertices, mVertices, vertices.length, 0);", which is a call to a class in Rajawali that copies the data (stored in vertices) to a FloatBuffer (mVertices).
I even tried using data from one of the Rajawali tutorials, and it still crashes.

So, I wrote up a report of what my program does and sent it off to the Rajawali developer, along with the source code for my program. Until I get a reply back from him or figure out the problem myself, the program is at a standstill. I could go back to jPCT, since I have a backup, but then I'd be right back to the low-performance issues.
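
While I wait, my best guess (and it's only a guess, not a confirmed diagnosis) is the usual suspect with these copies: the destination FloatBuffer has to be direct, native-ordered, and at least as large as the source array, roughly:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

FloatBuffer mVertices = ByteBuffer
        .allocateDirect(vertices.length * 4) // 4 bytes per float
        .order(ByteOrder.nativeOrder())      // the GPU expects native byte order
        .asFloatBuffer();
mVertices.put(vertices).position(0);         // copy the data and rewind for reading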

Tuesday, February 19, 2013

Shader knowledge (week 6 wrap up)

After talking with the jPCT-AE (game engine) developer about the speed problems and asking for advice (http://www.jpct.net/forum2/index.php/topic,3219.0.html)....it seems that I've hit the limit of what this engine can do. The worst part: the game is only doing less than 10% of what it was made to do.

So, for the last few days (most of the time went to other programming work), I've been converting all my code from jPCT-AE to Rajawali.
The new engine lets me create objects and interact directly with the vertex points, vertex indices, and vertex attributes, and feed in custom shader data/functions (which I'm still not sure how to do in jPCT).


The Exchange
At first, when converting my code to the new engine, I decided to use the source code directly and just copy what was needed, skipping the rest (Rajawali provides up-to-date source along with a compiled library). The idea was that I would get a smaller file size and more control over everything that was happening.
...I ended up spending 4 hours JUST on importing files, renaming things to work with my program, and trying to figure out "do I need this section of code, or can I cut it out?". It became a tiring process and a waste of good programming time.


The NEW Exchange
After a night's worth of rest, I came to the realization "just use the damn library. It comes in a convenient package, and if tweaking is necessary, you can make a new engine when not under a heavy programming workload"....So, I ripped out the jPCT engine and installed Rajawali (which was the easy part)....configuring my code to use the new engine, not quite so easy.

The way jPCT works:
= Setting up the game
1. Make an Activity
2. Make your main game code
3. Make a Renderer and pass in the main game code (so the onDrawFrame code knows where to pass the frame buffer reference)
4. Make a Touch Surface and pass in the main game code and renderer. The game code is so it knows where to send the touch commands; the renderer is so the app can set up the OpenGL environment and set the phone's renderer to your renderer.
5. Set the content view to your Touch Surface: "setContentView(touchSurface);". This tells the Android system to bring touchSurface into view and start running the code.
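
Pulled together into a skeleton (MainGame, GameRenderer, and TouchSurface are my placeholder names for the classes the steps describe):

import android.app.Activity;
import android.os.Bundle;

public class GameActivity extends Activity
{
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);                                  // step 1
        MainGame game = new MainGame();                                      // step 2
        GameRenderer renderer = new GameRenderer(game);                      // step 3
        TouchSurface touchSurface = new TouchSurface(this, game, renderer);  // step 4
        setContentView(touchSurface);                                        // step 5
    }
}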

= Running the game
1. The renderer, in its onDrawFrame:
    public void onDrawFrame(GL10 gl)
    {
        if (paused)
            return;

        fb.clear(back);

        screen.Update();
        screen.Render(fb);

        fb.display();
    }
calls your game code to update and render, and your game does as it's told. You never interact with the frame buffer yourself; it's passed to your game code's world, which has all your objects/lights contained inside.
2. Your touch surface receives input and sends it to your game code. My first game used a queue system; now it uses a static reference system (seems more efficient).


The way Rajawali works:
= Setting up the game
1. Make an activity, but have it extend RajawaliActivity instead of Activity (Android native)
2. Make a renderer and have it extend RajawaliRenderer instead of Renderer (Android native)
3. Make your main game code and give it a reference to your renderer (so it can interact with the renderer)
4. Tell the renderer about the main game code (for its onDrawFrame code, which says when to update and render)

...and that's it (for the set up). The rest is covered by Rajawali....which isn't the most comforting...BUT, it does make up for it when running the game.
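
As a skeleton, written from my memory of 2013-era Rajawali examples (so verify against the library's own samples; GameRenderer, MainGame, and setGame() are my names, not Rajawali's):

public class GameActivity extends RajawaliActivity
{
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        GameRenderer renderer = new GameRenderer(this); // step 2: extends RajawaliRenderer
        MainGame game = new MainGame(renderer);         // step 3
        renderer.setGame(game);                         // step 4
        super.setRenderer(renderer);                    // Rajawali covers the rest
    }
}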

= Running the game
1. (...skipping the boring "setting up the world view") CREATE OBJECTS...well, currently it still looks like a copy/paste of the code seen here (and that's because it is), but that's because I've been working on something more important to the game: GLSL shaders. ;-)


Making Shaders

To begin with, and put bluntly: Holy Crap, this stuff is confusing. And I blame the confusion on how "things" are processed. In conventional programming, there is a loop which cycles through all of the objects, and there is a defined variable which says "this is the ##th object, let's do something with it". In shader code, and this took me until reading NeHe's tutorial to fully understand, the "objects" are ALL processed at the same-ish time.

When passing the data to the shader, it is sent in buffers that contain: how much information there is, the type of info, the info itself, and how many items are in each "group" (although I may be confusing standard usage with VBOs on that one). Anyway, the data is all passed to the shader, it cuts the data into chunks, and each chunk gets independently processed in the vertex shader.

The end result of the vertex shader is a large pile of items containing the final info about each point processed. Specifically, where the point is (saved to gl_Position) and any other information which needs to be known about it (like the final color).
ALL the fancy lighting effects done on the object (including your 10+ light sources, light source colors, material glossiness, diffuse and specular properties), the fog amount, fog color, ambient world color, per-vertex color, and alpha amount....all compressed down to an RGBA value (known in shader code as a vec4, a float array with 4 elements).

The vertex shader data then gets sent to the fragment shader, which is....kind of abstract. The vertex shader processes each vertex (x/y/z) point, while the fragment shader processes pixel-sized chunks. For every pixel a triangle covers on the screen, the values written by the triangle's vertices get interpolated and handed to the fragment shader as one fragment. So in this way, multiple vertex points become 1 "point". Which explains how the machine is able to process so much data and display it on a small screen.


The process of learning the shaders, un-doing the confusion (caused by how shaders work), and making my own shaders (yay)...ended up taking a good 5-6 hours. Yea...it's that rough. Would have been nice if there was someone I could talk to to help me understand it faster/better. But DeVry isn't exactly in the business of teaching, and I'm the only person I know who is this deep into programming....or who even programs, for that matter (pretty sad).....


MY SHADERS!!

I haven't tested them yet, so there is a good chance they are not fully correct (I'm mostly worried about the fragment shader), but I am mostly positive that the vertex shader is correct. My only confusion is how to send the alpha values for each vertex...but that's because I haven't done anything with the game code yet.

Vertex Shader:
uniform mat4 uMVPMatrix;// model view matrix
uniform vec4 uCamPos;

uniform vec4 ambiantColor;// "air" color applied to everything
uniform float fogStart;
uniform float fogEnd;
uniform vec3 fogColor;// color applied after fog start

attribute vec4 aPosition;// vertex x,y,z
attribute vec4 aPlanePosition;// triangle x,y,z
attribute vec4 aColor;// rgb
attribute float alpha;// alpha value

varying vec4 vertexColor;// color of the vertex point

const vec4 WHITE = vec4(1,1,1,1);

void main()
{
    vec4 vertexPos = uCamPos * (aPosition + aPlanePosition);
   
    float fogWeight;// how heavy the fog color will be applied
    vec3 fogVertexColor;// color from the fog
   
    // make fog
    if (fogStart != -1.0) {
        fogWeight = clamp((-vertexPos.z - fogStart) / (fogEnd - fogStart), 0.0, 1.0);
        fogVertexColor = fogColor * fogWeight;
    } else {
        fogWeight = -1.0;
    }
   
    // make the color
    vertexColor = vec4(min(WHITE, ambiantColor).xyz, alpha);
    vertexColor *= aColor;
   
    // apply the fog
    if (fogWeight>-0.9) {
        vertexColor.xyz = (1.0-fogWeight) * vertexColor.xyz + fogVertexColor;
    }
   
    gl_Position = uMVPMatrix * (aPosition + aPlanePosition);
}
Fragment Shader:
varying vec4 vertexColor;

void main()
{
    gl_FragColor = vertexColor;
}

.....and now that that mess is solved (I've been confused about shaders since mid-development of my CCMaze project), I get to jump back into Java code and make some dynamic particles. If everything goes as planned, I could pump 10K particles into the world, each with their own color and alpha value, and the phone would run them all without breaking a sweat.

Monday, February 11, 2013

Flying particles (week 5)


Vector system (2D vs 3D)
After another realization of how primitive my particle system was (it was built with 3D particles in a 2D system; thank you, time, for making me forget there was a difference between the two), I took out all the 2D vectors and replaced them with 3D vectors. Now it can truly move around, as opposed to just along a 2D plane.

G-Data problems
But even then, moving the phone around did nothing, despite the gravimeter (tilt + accelerometer) showing that it was getting values. Turns out that although the values were being sent to the class, its subclass was receiving them and not telling its parent about them (due to me using "@Override" and forgetting about the "super(base)" call). This was patched up, and values were flowing through the system again.
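
The bug in miniature (hypothetical listener names; the real class wraps the gravimeter data, but the shape of the mistake was exactly this):

@Override
public void onSensorChanged(SensorEvent event)
{
    super.onSensorChanged(event); // <-- the forgotten line: without it, the
                                  // parent class never sees the new values
    // ...subclass-specific handling here...
}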

First Images
Meanwhile, all of this was stepping through debug mode; I still hadn't seen anything of the particles...because there wasn't anything to show yet. But after making the basic particle (4 points, color, billboard = true, add to the world), they came up without a problem....well, aside from the problem where the particles were so close to the screen that only 1 was visible at a time. This was solved by telling the particles to be at z = 20 instead of the default z = 0 (my fault for forgetting it).

At this point, I wish I had taken a screenshot to show what was happening, but I forgot. The screen which was now visible, the FIRST image of the particle system.....was 100 dots randomly flickering on the screen. Not moving, just flickering. It seems they were moving so fast that they were only visible once before flying out of view of the viewport. So I adjusted the velocity values before using them (divided them all by 1000), and then something was working.

They were no longer flying away...but now whimpering around. Yes, whimpering....not falling toward gravity, just jittering around the emitter location. Which, by the way, was working. It looked pretty nice, with a particle sprinkle effect that follows your finger (another screenshot which I failed to take).

This whimpering was caused by normalizing the gravity values as soon as they were received...which, by the way, Android has a record of the gravity value for the Death Star (the same one from Star Wars; see SensorManager.GRAVITY_DEATH_STAR_I). Thank you, funny Android developers :-). So, I took out my code normalizing the data, and the particles were now flowing.

First particle system (making a stream)
May not seem like much, but this is a fully working particle system producing a stream of particles. The emitter is in the middle, where the light blue squares are, and the stream is falling away (up and to the right).
There was a slight problem where the x axis was reversed, but this was resolved by a simple "change + to a -".

4 Streams, moving into space
I then tried more fingers, shown to the left, with particles going away from the user (the phone was sitting on the table).
Besides the system running at 14 fps (more on this later), this was great. Particles spawned, moved, and each had its own color.

4 streams, moving towards the user


Then, the much-anticipated view....particles falling towards the user. As expected, the particles got bigger the closer they got. Although not as drastic or amazing as I had hoped for, this still showed that they were working.

Note: please do not change the streams to yellow when the game is further in development.


Speed problems
As mentioned earlier in this post, 4 streams were causing the game to run at 14 fps. For a game which is supposed to handle up to 8 streams on a phone (12 on a tablet), with currently only 100 particles per emitter....this was really shitty performance.
When compared to the particle system Rajawali has, his absolutely kicks my system's ass. But I know why. My system has 100 objects per emitter with 2 triangles per object (to make a square). His system has 2,000 planes (4,000 triangles)....in 1 object.

The way the graphics system works, it likes to do as many things at once as it can. Having to open a memory channel (to pull data) and then close the channel (so another can be opened) takes time (about 1/2 ms?). Sure, that's not long for a few objects, but when there are many objects all needing separate memory channels to be opened, it adds up.

So, what I am going to do after this post is, instead of having 100 objects, have 10 objects with 10 particles each (20 triangles per object)....or 1 object with 100 particles (which would help when there are more than 100 total particles per emitter).

But copying his system isn't so easy. In the mentioned page, the system accesses the GL buffers directly, while mine (jPCT-AE) takes care of the GL access behind the scenes....so I need to find out how to access the data using some built-in function. Which, after a bit of looking, seems to be the job of the PolygonManager. No research has been done beyond "does this interact with the vertices?", but so far, the answer seems to be yes, and this is the class I'm looking for.

Future plans
- research how to use the Polygon Manager
- add more eye candy to the particles
- start researching how to add a tabbed menu system for the settings


Nearly forgot: Custom Squares
Sometime around when I was working with the single particle system, there were some speed glitches. Not much, but enough to be noticeable. So I checked the jPCT-AE forums for what others have done, and it seems that running your own function to make the squares is faster.

My code:
public Particle(ParticleSystem particles)
{
    object = new Object3D(Primitives.getPlane(1, 1));
    object.setBillboarding(true);
    object.translate(particles.emitter);
    object.build();
    object.strip();
    object.setScale(size);
    object.setAdditionalColor(color);
}

Code from online:
public static Object3D createQuad(float width)
{
    float offset = width / 2.0f;
    Object3D obj = new Object3D( 2 );
   
    obj.addTriangle( new SimpleVector( -offset, -offset, 0 ), 0, 0,
    new SimpleVector( -offset, offset, 0 ), 0, 1,
    new SimpleVector( offset, offset, 0 ), 1, 1);
   
    obj.addTriangle( new SimpleVector( offset, offset, 0 ), 1, 1,
    new SimpleVector( offset, -offset, 0 ), 1, 0,
    new SimpleVector( -offset, -offset, 0 ), 0, 0);
   
    // Make it billboard:
    obj.setBillboarding( Object3D.BILLBOARDING_ENABLED );
    // Set up the transparency:
    obj.setTransparency( 50 );
    obj.setTransparencyMode( Object3D.TRANSPARENCY_MODE_ADD );
    obj.setLighting( Object3D.LIGHTING_NO_LIGHTS );
    obj.build();
   
    return obj;
}
My new code (+ the online code above):
public Particle(ParticleSystem particles)
{
    Position = new SimpleVector();
    color = particles.color;
    size = particles.size;
   
    object = createQuad(1);
    object.setScale(size);
    object.setAdditionalColor(color);
}
This added some boost....although it makes me curious how the objects were created in the "Primitives.getPlane(1, 1)" function. Anyway, this may be changed/removed after applying the Rajawali-style optimization changes.

Edit:
The code which jPCT-AE uses to make the planes with "getPlane(int quads, float scale)" isn't that far from the code which I'm already using. The code I'm using has less padding and only makes a 2-triangle square.

public static Object3D getPlane(int quads, float scale)
{
    float startx = -scale * quads / 2.0F;
    float starty = startx;
    float tx = 0.0F;
    float ty = 0.0F;
    float dtex = 1.0F / quads;
    Object3D obj = new Object3D(quads * quads * 2 + 8);
    for (int i = 0; i < quads; i++) {
        for (int p = 0; p < quads; p++) {
            float dtx = tx + dtex;
            float dty = ty + dtex;
            if (dtx > 1.0F) {
                dtx = 1.0F;
            }
            if (dty > 1.0F) {
                dty = 1.0F;
            }
            obj.addTriangle(
            startx, starty, 0.0F, tx, ty,
            startx, starty + scale, 0.0F, tx, dty,
            startx + scale, starty, 0.0F, dtx, ty);
            obj.addTriangle(
            startx, starty + scale, 0.0F, tx, dty,
            startx + scale, starty + scale, 0.0F, dtx, dty,
            startx + scale, starty, 0.0F, dtx, ty);
            startx += scale;
            tx += dtex;
        }
        starty += scale;
        startx = -scale * quads / 2.0F;
        tx = 0.0F;
        ty += dtex;
    }
    return obj;
}

Tuesday, February 5, 2013

Finally some finger friction (week 4 progress)


New touch system
Turns out the system I was using for touch, the same one which worked flawlessly for my previous project, was (for the most part) impossible to use in this project.
The old system was queue-based. Each touch triggered a touch event (normal action), the event was then translated into a touch object (my code) and added to the back of a queue (also my code). The problem was that multiple users were not being detected. My previous post about the pointer IDs was correct about how they work...but when removing a touch, the system didn't recognize it. Then there were crashes about syncing, which were just beyond my logic.

The new touch system sends the x/y location (in screen coordinates) straight to static variables (visible everywhere in the program), and they are all processed each update phase of the program (regardless of whether there was a new touch event for that finger or not). Although this approach doesn't preserve the full stream of data from x1/y1 to x2/y2, it does represent the most up-to-date point location and prevents a flood of old data from filling the screen with particle emitters.

Currently, all the touch points are visible, no delay is shown (besides the render update delay), and the system recognizes when a touch is active or not.
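
Roughly what the new system looks like, condensed (the names are mine and the real code has more bookkeeping, but the MotionEvent handling is the standard multi-touch pattern):

import android.view.MotionEvent;

public class TouchState
{
    public static final int MAX = 10;
    public static final float[] x = new float[MAX];
    public static final float[] y = new float[MAX];
    public static final boolean[] down = new boolean[MAX];

    // Called from the touch surface; writes straight into the static slots
    // that the game reads on every update.
    public static void onTouch(MotionEvent e)
    {
        int index = e.getActionIndex();
        int id = e.getPointerId(index);
        if (id >= MAX) return;

        switch (e.getActionMasked())
        {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_POINTER_DOWN:
                down[id] = true;
                // fall through to record the positions
            case MotionEvent.ACTION_MOVE:
                for (int i = 0; i < e.getPointerCount(); i++)
                {
                    int pid = e.getPointerId(i);
                    if (pid < MAX) { x[pid] = e.getX(i); y[pid] = e.getY(i); }
                }
                break;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_POINTER_UP:
            case MotionEvent.ACTION_CANCEL:
                down[id] = false;
                break;
        }
    }
}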

My fingers, all visible and colory


Dead/alive/dying Particle System

After deciding that each user was going to have their own particle system (to ease the yet-to-be-added settings menu integration) and coding the initialization/setup in the user class, I found an interesting problem. The particle system automatically generates a large number of particles at the beginning and starts moving them around. When a particle's life expires, it respawns at the emitter's location....but the particle never truly dies, even when the user removes their touch. So this had to be fixed (now that the system can recognize when the user removes a touch).

public void Update()
{
    boolean sUpdate = false;
   
    for(int i = 0; i < 10; i++)
    {
        if(Users[i].State == 1)
        {// a touch was added or is still active
            Users[i].InitSystem(world);
            Users[i].Update();
            //Move(Users[i]);
            sUpdate = true;
        }
       
        if(Users[i].State == 2)
        {// user removed a touch
            if(!Users[i].isDieing())
            {// if it's not dieing, then kill it
                Users[i].Kill();
                Users[i].Update();
            } else
            {// waiting for the last few particles to die
                Users[i].Update();
            }
           
            if(Users[i].State == 0)
            Log.i(TAG, "Removed user: " + i);
            sUpdate = true;
        }
    }
   
    Update = sUpdate;
}
InitSystem(): sends the current 3D world to the particle system so each system can handle adding/removing 3D objects on its own.
Update(): updates the current location of the pointer in the system (the Users[i] variable already contains the touch location and the particle system), moves active particles, and advances the stages of any active particles (color, velocity, life cycle).
Kill(): tells the particle system not to renew any dead particles. The particle system also checks each update cycle whether all the particles are dead yet.
isDieing(): returns whether the particle system is in the process of dying: "return (particles.State == 2);"

The following is the Update(x,y) code which is called in the Users[i].Update() function

    public void Update(int x, int y)
    {
        this.x = x;// update the new location of where
        this.y = y;// new particles will spawn
      
        boolean dead = true;
      
        for (int i = 0; i < particles.length; i++)
        {
            Particle p = particles[i];
            if (p.lifetime <= 0)
            {
                if(p.lifetime == -1)
                    continue;// the particle is dead, so don't touch it
              
                if(State >= 2)
                {// don't make any new particle, preparing to kill this system
                    world.removeObject(p.object);
                    p.lifetime = -1;
                    continue;
                }
              
                // revive this particle
                newParticle(p);
            }
            dead = false;
          
            p.move();
          
            p.color.Decay(decay);// needs an update for using a color array
            if (p.color.RGB() < 0)
                p.color.Set(0);
            p.UpdateColor();
        }
      
        if(State == 2 && dead)
            State = 3;
    }
Another type of float?
I've also been reading about how integers with bit-shifted values are faster to use than float values, and I'm in the process of testing this.
Example (using 8 bits for the fractional part, so 1.0 is represented as 256):
Instead of
float px = 15.82673;
it would become
int px = (int)(15.82673f * 256); // which becomes 4051 in code
Adding velocity to the variable works as normal, "px += vx;". But when using the variable to draw to the screen, it gets right-shifted back to normal.
int px, py;   // positions in 24.8 fixed point
int vx;       // velocity, also in 24.8 fixed point

void Init()
{
    px = py = 100 << 8;// 100.0 becomes 25600 in code
}

void loop()
{
    px += vx;// vx is accumulative
    vx += particles.gravity;
    Draw();
}

void Draw()
{
    world.draw(particle.image, px >> 8, py >> 8);// shift back to whole pixels
}
Advantage:
- It holds decimal-like values while not needing an "(int)" cast before being used in int-only functions
- Incrementing/decrementing the value works the same as a normal float would

Disadvantage:
- It reduces the range the variable can hold. With 8 bits spent on the fraction, a signed 32-bit int has only 23 bits left for the whole part (a max of about 8.4 million), while a float (with decimal places) reaches about 2^127.



Current progress actions

The thing I'm still working on now are getting the billboard action of each particle working....well, I only need to get the class working, everything else just falls into place. Currently, it all looks correct enough for testing, but I've yet to have time to test the app (due to making tweaks to the code and working on a project for another class).