Saturday, November 30, 2013

Saturday Report

Advice Taken

I took my body's advice and slept for over 12 hours, then logged on long enough to check for any urgent emails. The plan was to find nothing and log off immediately, and so enjoy my weekend off.  Alas.

First thing I saw was that one of the internal versions had escaped and was the star of a blog video on the FPSC-R forums ;) I knew it was an internal version on account of the sand cubes being corrupted, which has been fixed in the official final version of BETA 1.003. I also got an email from Rick green-lighting this version for release, so I stayed logged on to supervise the release of the beta for today. The upload takes a while, so watch your Products page and your inbox for news of the new update.

Occluding In Brief

As you explore the new update, you will notice two things. One is that your frame rate goes up (or should) and the second is that some objects are disappearing (and reappearing) on you.

The visibility issue is down to the way the occlusion system has been created: it relies on the larger invisible bounding box that surrounds each object becoming visible BEFORE the actual object it relates to is visible to the player. Unfortunately there are situations where the player can turn a corner and surprise the system, meaning for a moment the object is not there and then it appears. This happens for only one render cycle, but it is enough to spot the artifact!  I am currently looking for options to minimize this effect, but all the Google links I have found so far allude to the fact that this is acceptable and common for occlusion systems.  If anyone can find a link which identifies this issue, AND SOLVES IT, then please let me know through the comments section.

Another issue you MAY spot is that really distant objects now completely disappear. In the final version, objects at this distance are replaced with quads (two polygon billboards) to fake the object's appearance for increased rendering performance. Alas my R&D into the technique of generating the textures for these quads over-ran the planned release schedule for this update and so the quads had to be hidden for their own sake.

A Secret Tip

If you want to see what the quads look like, you can comment back in the two CLIP commands in the entity and quad shaders.  I only recommend this if you don't mind the technical challenge and the less than satisfactory visual effect.

A Special Thanks

All is not lost, however, as on the eve of the official 1.003 release I received essential advice from my new best friend Matty, the author of DarkImposters, who had already solved the issue of projecting objects onto quads and very kindly shared his secret sauce with me.

I have glanced through his equations and I 'just about' understand the technique used, and am now very eager to code this into the engine and see what happens.

The Question Of Performance

We still have a LONG way to go on the subject of performance, and I still have some hellish optimizations I want to attack the engine with. I did take a little time out to solve some legacy model issues and to add the store to start the whole legacy support ball rolling, but I am curious what the community (you guys and gals) would have me do at this point.

I could continue to work on PPP (Performance Performance Performance) to the exclusion of all else, or I could blend in a little non-performance work such as more legacy support, Weapon HUDs (for which the art is now available to me), the auto-fence builder, adding the RPG, etc. I am sure I will do one or two of these things for the next update in December, but I am curious as to what percentage the wider world would set for this division of labor.  Imagine it was another slider between 0% and 100% to control Lee's non-performance activity :)

Signing Off

I know it's against policy to blog at the weekend, but I have 450MB of data to upload and decided to give you an update while I waited for the files to leave Wales. I have a meeting Monday and Tuesday next week so I can't imagine I would have much Reloaded news to blog, but if anything does occur, I will be sure to include it!

Friday, November 29, 2013

Friday Finally

Captain Sensible

In order to put my degenerating, over-worked body on a better footing, I started today as normally as I could manage. Got up at 7AM, breakfast, work, dinner, work, finish up and test, upload.  Knowing full well the quad situation could easily soak up all my remaining daily hours, I decided to shift focus to getting a solid update ready for the Alpha Cascade. It's uploading for an internal test now, and then I will be ready to upload to the main FTP this weekend if Rick agrees that you can all enjoy the performance improvements so far. He's also prepared to delay the update a week in order to add more polish to the release before you get your hands on it, so watch this space for that!

What's In Store

As I did not find the time to code the auto-fence maker, I quickly added in the Game Creator Store feature with a few modifications for Reloaded.


In the shot above, I have taken a rather cool refinery-looking construct from the store and added it to a simple level.  I took the liberty of marking the item as Reloaded compatible, as I have since fixed the collision and orientation issues that originally prevented this model from working properly. It is also the first and only Reloaded-approved model in the store right now, but rest assured more will quickly follow now that legacy support is going into the engine, as each new issue is identified and resolved.

Some other issues, such as static object collision, have also been improved: there is no more falling through ladders when you try to make a bridge out of them, you can now climb the staircase object, and you can no longer sneak through invisible gaps in the fencing.  I am still dubious about the dynamic collision shapes, but when I have more objects to test against I can put those through the mill.

Signing Off

I have temporarily disabled quad rendering, and enabled extra low LOD static buffers. The shadow, reflection, light ray and the main camera all use these dynamic static batch buffers now so you should see some speed improvement. I once reported my 'run to the river' demo went from 24 to 29 then up to 50. Well it went back down to 42 once I re-activated some needed rendering code, but it's still heading in the right direction.

I have since learned some new techniques and have had a few ideas of my own to solve the final quad problem, so hopefully we can re-introduce quads next week to gain some more performance.  I also have a mind to add some code to the terrain system to replace a field of polygons with two polygons AND apply occlusion techniques to the terrain segments too, to see if that helps performance.

Although occlusion is in and working, I am not 100% happy with it yet. You will read in many occlusion articles that the momentary 'popping in' of objects once the engine has determined they should be visible is acceptable and barely noticeable. Having tested it for many days, I DO notice it and I DON'T find it acceptable. The solution will probably hog some performance as I suspect I will have to do a pre-draw with the occluded shapes, stall the GPU then draw the 'now visible' once-occluded objects. When you get hold of the update, check out the artifacts on this and do send me any links you find from anyone in the world who has solved this 'popping in' effect without huge GPU stalls.

Wednesday, November 27, 2013

Wednesday Or Thursday

Blurry Work

First up, thanks for all the calls to action and recommendations of rest, a very sensible idea and anyone listening would be a fool to ignore such sage advice.  
Unfortunately software seldom writes itself and deadlines have a habit of getting closer and larger, and it would not do to just leave the update half-finished which is why I must carry on and finish it.  Perhaps I can find a few days to rest after the update is out the door!

Perspective Pains

The more I work on this QUAD texture rendering, the more I am convinced my brain was not wired for 3D thinking. I've literally spent 10 hours on the same thirty lines of code, trying every combination of crazy ass ideas I can think of to get a section of render to re-render centrally at the correct angle and distance. You would think it was easy :)

The good news is that I am making progress, albeit slowly. I figured out some neat things, like how you can scale up the perspective matrix to create a sort of zoom effect. I am currently working on a way to tweak the view matrix generation to tilt towards the target object that needs rendering, as simply shifting the view camera was not enough.  I also need to handle the multitude of little attributes for things like resolution changes, aspect changes, camera position and angle changes, object dimension changes, and then quad resolution and densities which can be adjusted in real time.  It's clear to me that this is not a 'few days' work' as I had originally thought, and it will take some considerable time and thought to become fully rounded and complete.

Signing Off

A nice cup of tea methinks, and then see if I can get 'something' concluded from all this work and build an internal version for Ricky baby.  The next update is imminent so internal testing is the plan; I only wish I had more done to test ;)

Tuesday, November 26, 2013

Tuesday At Quadington

Short Day

It's very rare I allow physical discomfort to stop me coding, but for whatever reason Tuesday has been a real mare for me. Might be overwork, or just the weather, but my energy levels are barely registering.  I did however have the professional sense to make today count so I booted up and carried on regardless for a few hours.

Quad Progress

More progress made here: each quad now has the correct UV coordinates assigned, the viewport system renders a single object into each gap in the community texture, and I have started the code to correctly align the view camera with the target object.

I am currently struggling to ensure the view camera alignment remains unaffected, and to work out the optimum distance to place the camera from the object so that it encompasses the whole object and matches the quad size perfectly too.  Doing this correctly means the quad system will work from any camera, any angle, and any quad size.

I remember my early days at school, and I don't think we did advanced 3D maths at GCSE level, but I wish we had.  What might take a modern programmer five minutes (working through the relationships between all my vectors and matrices) will take my brain a considerable amount of time, usually through trial, error and stubbornness.

Other Bits

Also managed to catch up with my emails, do this blog and start one or two plates spinning so no-one is waiting on my feedback.

Signing Off

Hopefully Wednesday I will feel better. For now, I am going to put the kettle on, watch an episode of Time Team and then get an early night.  I did not sleep a wink (that I remember) in the last 36 hours, so hopefully I can drift off. The quad system is the last visual hold-up to the next update, as after this it will be clean-up and testing, which requires less 3D math and more donkey work.

Monday, November 25, 2013

Monday Quadz

Houston, We Have A Quad Texture!

After much tinkering inside the bowels of the engine, we now have quads using a render target texture created entirely on the video card and rendered with the object that created it.  Here is the glorious mess for your enjoyment:


As you can see, we are rendering the whole texture for each quad, and rendering all the objects to the target surface as they happen.

The next step, now we can see something along the right lines, is to sub-divide the texture into grid squares, use a viewport to only render to the correct square, render each object only once per generation request, modify the quad UV to only show the part of the texture that applies to its own original object, and so on.  I had thought I would be further along, but it's always a good place to be when you render something and can actually 'see it'.

Signing Off

It seems my body is once again on night shift, so I write this at the end of my day, which is about 5AM.  Hopefully I can get up before the sun goes down, as I would so like to see some daylight before I start my day.

Sunday, November 24, 2013

Sunday Pottering

Legacy Bits and Bobs

After spending MOST of yesterday playing FEAR 2 and the start of today watching an episode of The Planets, I decided to boot up the ol' computer and do some non-performance coding to give my brain a break.


What you are goggling at are three objects from the Game Creator Store, which is being integrated for the next update.  To this end, I spent half of Sunday making sure some legacy models and entities would load in first time without modification.  I am certainly not going to make Reloaded backwards compatible with everything in the store, and will be adding a filter by default to show only Reloaded content, but this filter will be adjustable so you can find older legacy static entities and see how they fare (including any that you have bought previously via FPSC Classic).

Occlusion Works

Also made some continuing fixes to the new renderer to stop it flickering due to multiple queries and cameras confusing it.  I am saving the true imposter work until Monday, as that's a full work day of headache, that one!

Signing Off

My plan for this coming week is to have a solid version by Wednesday, on which plenty of testing can be performed THU and FRI. The old regime of releasing new updates that break old projects should really be put to rest here at TGC, and we do that by spending two days NOT adding features and JUST testing (that is, if I don't get strong-armed into adding anything during those two days).  The objective is stability, consistency, THEN performance, THEN the store integration. You can then be at least assured that the update does not break stuff you have working right now.

Friday, November 22, 2013

Friday Foundations

Lee Stops Digging

After many days of performance tuning and investigating, I have finally got to bedrock on everything that spends GPU calls.



As you can see, when looking up at the sky there are only SEVEN draw calls now being made. Four of those are sky box surfaces (one for each 'side' of the sky being rendered), one for the post process quad to render the main camera to the screen, one for the dummy terrain object which is needed to regulate the terrain shader system and finally one for the world sized water plane.

I also discovered the cause of the 'super high drain' from the post processing: the lightray camera was rendering everything, even if you had switched it off in a previous session. Now fixed.

After hacking out my last few singular calls and noticing no performance gain beyond the high-300s mark, I decided to leave them be.  I am now curious what FPS scores users will get from a blank scene looking at the sky with everything switched off. I agree it's not the makings of the next killer game, but it will be very interesting to see how low-end systems handle this little experiment, and if they DO perform badly, then we can start to look elsewhere for the culprit, such as the CPU, swap files or even the panel that displays the score in the first place :)

Further Ideas

There are one or two further ideas beyond this minimalist position, such as using the occlusion system to detect whether any water plane pixels were rendered, and if not, hiding the plane until the occluder says it is visible once more.  The reason I am not pursuing this at the moment is that with the new front-to-back draw order, all of the underground water plane is Z-rejected instantly, so it does not hit performance beyond the extremely minor cost of the draw call itself.

Moving On

Now that I know every draw call personally and can account for everything the GPU is doing, I can start to build things back up.  One of the first additions will be the True Imposter System, which will create community textures for each Instance Stamp buffer and then GPU-render objects into them to provide my quad textures.  It's a priority because right now I am using a placeholder 'building' texture for the quads, and allowing that into the next update would make all your screenshots from that version look very odd indeed.

I also have it penciled in to add some more legacy support too, but more on that when I discover I have the time to do what I have planned. It should be a welcome addition if it all works out :)

Signing Off

Well I did say it was going to be all Performance, Performance, Performance and I think you will agree I have been rather single minded on this point. There is more performance to be had elsewhere such as faster physics for polygon style objects, faster AI by diving into the source code and seeing how I can save speed and of course going through each shader and seeing if I can write less-hungry ones that produce similar results as the top end ones. Hopefully before too long we'll get to the point where it 'just works' for most of you and we can start on the really exciting stuff like alternative camera views when editing, completion of the weapon systems and refinement of the character and AI behaviors.

P.S.

Also, just backing up for the 'evening' and realized I was showing a metric shot from a debug build. Compiled in release mode and got a slightly better FPS for the sky test ;)



I also quickly loaded my 'run to the river' level, and originally I was getting 24fps fully loaded, and after later performance work I got 29fps on Wednesday. Just tried with this 'evenings' version and with everything switched on I am now getting 50fps!  This was one of my pet goals, to get this small level playable, and it certainly is now.  I still want to hit 60 fps, and once I look more into AI, Physics and Shader optimizations I am sure I will not only get this but exceed it.

A good week for me. Until Monday, have a good weekend! More to come next week when we start our live-quad adventures, and if we're lucky, our half a million trees test!


Thursday, November 21, 2013

Thursday Tinker

Meeting Day

With a six-hour talking appointment starting at 8AM, there was not much time left in the day for coding, but I managed to look into a few small areas.  The FISH-EYE issue has now been fully explored, and the problem was that the Reloaded engine was passing in a horizontal FOV but using it in the perspective matrix calculation as a vertical FOV value, with the aspect fixed at 1.777. The new version sets the aspect ratio based on the resolution you are running under, and the FOV now provides a vertical angle. By default, the buildings and high walls no longer skew out of all proportion when you look at the sky, and for those users who want the old style back, just ramp up the slider until you get what you want.


Now when you look up, the fish-eye effect is virtually gone and you get the kind of perspective you might expect from more traditional games that set a reasonable 85 degrees for horizontal FOV rather than the 170 degrees that I was using before.

Where Those Draw Calls Go

I also wanted to find out why an empty terrain scene took 73 (SEVENTY THREE) draw calls to perform, and so before Thursday was out I decided to make a shopping list of what GPU calls I was making. Here they are:

12 draw calls for the two overlapping skies (day and night, one side per call)
48 draw calls for the BlitzTerrain module segment renderings (one viewpoint)
3 terrain object calls (to populate the texture painter, editor water plane, etc)
1 water plane draw call which lies under the flat terrain
9 draw calls for the nine-stage post process for Bloom, Etc
1 vegmap draw call to paint the texture paint to the paint camera

As you can see, we have a few questions to ask of our engine here. We don't need 12 calls for the sky; when not transitioning, one sky mesh would do, so that's 11 calls that can be saved. The editor water plane is not needed in game, so it should be removed, as should the texture painter when in-game! A simpler post process shader could manage Bloom in fewer than nine passes, I am sure. All told, I could probably shave 18 draw calls off the above 'empty' scene. I also want to investigate the 48 draw calls that produce the terrain, just to make sure they are not being rendered twice, and at least to confirm I need 48 segment meshes. Maybe I can find some settings to fiddle with in this regard too.

Signing Off

Quite tired after today's various antics, so I will be starting my research into True Imposter technology on Friday, which I hope will fill in the last of the visual gaps in my performance-boosting trinity and get those distant QUAD polygons textured properly. I had thought of faking the QUAD textures, but it would take me as long, and probably longer, to create a system to produce fixed baked textures to accompany each entity you bring in.  I have a plan, and it should not take too long, so it's worth spending a day on.

It was also aired that perhaps progress is not as fast as the community may wish, and that perhaps too much time is being given over to pure performance work. I think, however, that in the long term this 'boring' performance work will reap rewards much later on, when we take high frame rates and clever scene handling for granted.


Wednesday, November 20, 2013

Wednesday Delight

A Sweet & Sour Day

No, I am not having a Chinese meal; I am summarizing my adventure from testing the all-new 'integrated' batch buffer and occlusion system. I swapped in the required code, made sure it did what it was supposed to do at the time it was supposed to do it, and LO, I went from 24fps to 29fps. Sigh.

You can almost imagine the OOOs and AAAs in the meeting on Thursday when I display my 5fps increase for all the office to see!  That was the sour.

The Sweet

I decided to pick my confidence up off the floor and start from the beginning, creating a flat landscape and then populating it with a few hundred high polygon trees.  My current Beta 1.002 update rendered the amazing vista at 42fps (with everything switched off) and an astounding 10fps with everything switched on. I re-instated my new code, which optimizes the terrain and uses the triple whammy of GPU occlusion, geometry batching and the QUAD system, and my new statistics on the exact same level were 100fps with everything on and 155fps with everything switched off.  Silence.  Basically, a 1000% increase in performance for that level :)

Lee's New Mission

I still have to sort out some textures for the quad polygons and maybe fiddle with the transition distances (ultimately a job for a new slider), but my mixed results gave me a new mission: to account for every draw call and every rogue performance metric in the system, and find out WHY my run-to-the-river level is not showing the same increase in performance.  As you can tell, I am not quite finished with the 'make the engine seriously fast' work, and hopefully you can bear with me as we go from small victory to small victory in the pursuit of a decent games engine.

Signing Off

I am conscious of the fact I have a 7AM wake-up call tomorrow so I cannot indulge in a large blog post :)  All in all, pretty happy with my performance boost on some levels, and I know pretty much what I am doing over the next few days to get some more!!


Tuesday, November 19, 2013

Tuesday In QUAD World

Some More Nice Performances

The star of the show this evening was my friends the QUADs. This pesky little pair of polygons helped me fill the gap caused when the LOW LOD models disappeared from the scene to save render drain. They look much like the LOW LODs but are entirely flat. They are also so far away that you don't really notice they are flat, or much care, when your attention is on the foreground. Here is my horrid little prototype, with visuals only a mother could love.


As an antidote to two developer shots in two days, Rolfy comes to the rescue (once more) with THREE new skies for Reloaded. Amazingly, he gifted them to us completely free to be used for the Reloaded product, and the moon one is my new favorite!  One day I hope to add the technology to make our moon (and sun) round ;)


I could go into amazing levels of brain-numbing detail, but it's 3AM and my pizza is in the oven, so this will be brief.  The new QUAD buffer work completes the polygon fill for the screen, and all that remains (all, he says!) is to orient the quads to always face the camera and to render the contents of each quad with the correct little texture to fool the user into thinking it is still the LOW LOD model you are looking at.

I was going to fake it with seven angled shots of each texture, but the more I think about the cleverness of Dark Imposter (not Dark Occlusion, which is also pretty clever), the more I am keen to at least TRY to see if I can get each quad to carry an accurate representation of the original model.  So what if it takes 100MB of texture memory and hits the performance for a few FPS; the finished render will be gorgeous, with almost polygon-perfect accuracy across a vista stretching over a mile wide.  I must keep reminding myself to constantly aim HIGH!
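To make the "orient the quads to always face the camera" step concrete, here is a small sketch of the maths involved; the vector helpers and the `billboardCorners` name are my own illustration, not actual Reloaded shader code. It builds a Y-axis (cylindrical) billboard so distant buildings and trees stay upright rather than tilting with the camera's pitch:

```cpp
#include <array>
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 norm(Vec3 v)          { float l = std::sqrt(dot(v, v)); return {v.x/l, v.y/l, v.z/l}; }

// Build the four corners of a Y-axis (cylindrical) billboard: the quad spins
// to face the camera around Y only, so distant objects stay upright.
std::array<Vec3, 4> billboardCorners(Vec3 centre, Vec3 camPos,
                                     float halfW, float halfH)
{
    Vec3 toCam = sub(camPos, centre);
    toCam.y = 0.0f;                              // ignore camera pitch
    Vec3 look  = norm(toCam);
    Vec3 up    = {0.0f, 1.0f, 0.0f};
    Vec3 right = norm(cross(up, look));
    auto corner = [&](float sx, float sy) {
        return Vec3{ centre.x + right.x*halfW*sx + up.x*halfH*sy,
                     centre.y + right.y*halfW*sx + up.y*halfH*sy,
                     centre.z + right.z*halfW*sx + up.z*halfH*sy };
    };
    return { corner(-1,-1), corner(1,-1), corner(1,1), corner(-1,1) };
}
```

In the engine proper this work would live in the vertex shader so the vertex buffer never needs rewriting; the CPU version above just makes the idea easy to test.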

Signing Off

A few more minutes before I burn my pizza.  Wednesday I will finish the last two QUAD tasks and then see if I can integrate the lot into the engine to see where we are. I am not too precious on this point and won't leave the prototype until it does everything I need it to do, and I have a meeting on Thursday to show my wares.  I was hoping to show the final software with a huge speed increase, but given the complexity of what I've had to go through, I will now be very happy if it's a complete and competent prototype that gets demonstrated.  I can smell burning, so time to sign off :)

Monday, November 18, 2013

See all five copies of Lincoln’s handwritten Gettysburg Address on the Google Cultural Institute

Not quite four score and seven years ago, I was an elementary school student, staring at a classroom map, gripped by the (mistaken) deduction that since Los Angeles was in the southern half of the country, Civil War battles must have clattered on the ground outside my home. While a teacher eventually helped me understand that California wasn’t in the Confederacy, the moment led me to understand the weight of history and that it has shaped the world into what it is today.

Today, on the 150th anniversary of the Gettysburg Address, we’re helping make the past come a little bit more alive. Three new exhibits now available on the Google Cultural Institute focus on President Lincoln and the 272 words that shaped a nation’s understanding of its identity. Thanks to our friends at the White House, the Lincoln Library, Cornell University, Dickinson College and the Library of Congress, you can browse high-resolution digital versions of all five Lincoln-handwritten copies of the address. You can also:


Comparing two copies, side by side

You can also contribute your own version of the Gettysburg Address to Learn the Address, a project by documentarian Ken Burns, who has also been reaching schoolchildren across the U.S. with Google+ Connected Classrooms.

Most of us will never stand in the Lincoln Bedroom and see the handwritten draft exhibited there. But now anyone with access to an Internet connection can explore all these artifacts from this defining moment in history—perhaps a bit more accurately than when I gazed at that map.

Monday Occlusions

Performance In The House

My plan for Monday was QUAD batches, so of course by the end of Monday I had instead put the finishing touches to the new GPU Occlusion and Culling system for Reloaded.  Now prepare yourself for a prototype screenshot; those of a weak disposition, please avert your eyes now.


As you might see, the numbers in the top left tell an interesting story about where we are in engine terms. The first number to note is the O value, which tells us how many buildings I added to the 50,000 x 50,000 world. Each building has three LOD levels of roughly 5000, 4000 and 3000 polygons. The next important number is 310MB, which shows we only used 109MB to render all the buildings within visual distance, with the buffers created in real-time as part of the dynamic buffering system. As we move through the world, memory usage will not increase much beyond this if the spread of buildings stays consistent, as buffers are efficiently reused.

My favorite statistic is CALLS:4, which tells us that only FOUR draw calls were made to create the scene, and the POLYGONS value of 24796 shows the population within those four drawn buffers. A quick calculation and you will figure out that we have static geometry batching going on as well!

The 1750 fps is a little deceptive given we don't have the rest of the engine in play, but consider that before the above techniques the same scene was rendering over 40 draw calls and nearly half a million polygons at a frame rate of around 400-500 fps.

GPU Occlusion

This worked better than I thought and really takes advantage of the speed increase you can get when you draw the scene (in front to back order) using a non-color write mode, then ask the query whether any pixels were drawn.

Amazingly, after lots of careful thinking and slow coding, the engine worked first time out of the gate. It was a shock to see the call rate drop to single digits with everything still rendered as though EVERYTHING was being drawn. Neat.
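For the curious, the logic of that front-to-back pass can be sketched on the CPU with a one-dimensional stand-in for the depth buffer. The `occlusionPass` function below is purely illustrative (my own names throughout); the real engine issues GPU occlusion queries against the bound-box geometry instead, but the survive-or-cull decision is the same:

```cpp
#include <algorithm>
#include <cassert>
#include <numeric>
#include <vector>

// One object's bound-box, already projected: a screen-space column interval
// [x0, x1) plus its distance from the camera.
struct Box { int x0, x1; float depth; };

// CPU sketch of the front-to-back occlusion pass: an object survives only if
// its bound-box would touch at least one pixel not already covered by
// something nearer. Returns a visibility flag per input box.
std::vector<bool> occlusionPass(const std::vector<Box>& boxes, int screenW)
{
    std::vector<float> zbuf(screenW, 1e9f);            // "far plane"
    std::vector<int> order(boxes.size());
    std::iota(order.begin(), order.end(), 0);
    std::sort(order.begin(), order.end(), [&](int a, int b) {
        return boxes[a].depth < boxes[b].depth;        // front to back
    });
    std::vector<bool> visible(boxes.size(), false);
    for (int i : order) {
        const Box& b = boxes[i];
        for (int x = std::max(0, b.x0); x < std::min(screenW, b.x1); ++x) {
            if (b.depth < zbuf[x]) {                   // a pixel would pass
                visible[i] = true;
                zbuf[x] = b.depth;                     // occludes later boxes
            }
        }
    }
    return visible;
}
```

Note why the front-to-back order matters: a near box is tested first, writes its depth, and every later box hiding behind it fails the query for free.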

Signing Off

As tempting as it is to move this over to the main engine and test against a variety of entities, sizes, textures and so on, I really want to finish the render scene with QUADs. Right now the LOW LOD just disappears after a certain distance and is not replaced with anything, so objects just ping out of existence. I had a mind to prototype using the Dark Occlusion module which provides this functionality out of the box, but on deeper reflection I would need it very closely integrated into the instance stamp occlusion system I have now, and there would be some compliance issues in the middle of all that.

I will have it as my fall-back, but for Tuesday I am hoping I can create a large static pool of quads and write a shader to have them face the camera with the right texture. Generating the texture will be another matter entirely, and it will be a dice throw as to whether I go for my previous idea, which was seven angles around the Y axis and one from above (for the shadow cameras), or opt for the more sophisticated technique used by Dark Occlusion, which renders each object perfectly to a communal quad texture and only refreshes the render when absolutely necessary. The result is a MUCH more believable transition in the far distance, and is perhaps the hallmark of a quality, future-proof engine. My concern is that if you place down 100,000 trees, that's rendering 100K entity views to a sequence of communal textures, and that is only going to consume video memory and real-time performance.

My dilemma is whether the extra time to test the more advanced technique is worth it, or whether it's more important to get everything in the main engine as soon as possible, and continue testing and improving from that base.

From your CS class to the real world: a deep dive into open source

Today marks the start of Google Code-in, a global online contest for pre-university students (13-17 years old) interested in learning more about open source software. Participating students have an opportunity to work on real world software projects and earn cool prizes for their effort.

For the next seven weeks students from around the world will be able to choose from an extensive list of tasks created by 10 open source projects. Tasks range from coding in a variety of programming languages to creating documentation, doing marketing outreach and working on user interfaces.

Participants earn points for each task they successfully complete to win T-shirts and certificates. At the end of the contest, 20 students will be selected as grand prize winners and flown to Google’s Mountain View, California headquarters. Winners will receive a trip to San Francisco, a tour of the Googleplex and a chance to meet with Google engineers.
Google Code-in 2012 grand prize winners at the Googleplex with a self-driving car

More than 1,200 students from 71 countries and 730 schools have participated in Google Code-in over the past three years. Last year, our 20 grand prize winners came from 12 countries on five continents!

We hope this year’s participants will enjoy learning about open source development while building their technical skills and making an impact on these organizations. Please review our program site for contest rules, frequently asked questions and to get started.

Posted by Stephanie Taylor, Open Source Programs

Friday, November 15, 2013

Friday Performance Work

The Return Of Instance Stamps

As part of my batching code, I returned to a batching system I wrote a while ago for the old segments called Instance Stamps. The idea is that instances are converted to reference data within a large grid, and the geometry is only created and rendered on a need-to-know basis.

I had to heavily adapt it to the dimensions and unique style of the Reloaded level size and preferred ranges, and it took all day, but the end result was a prototype that positively eats polygons for breakfast.  My last demo had 7800 combat buildings added all across the world, with HIGH and LOW LOD buildings being rendered in batches only where I was standing at the time. The draw calls were fewer than 50 and my frame rate was well over 500 fps. Yes, it included the actual entity shader, but it did not include all the other gump required to run a games engine. The real teller was the impact of the dynamic creation as you moved through the world, which kept the motion smooth even at 500 fps. One fear of course was that so much new geometry being dynamically created would slow down and jitter the smooth running of the engine. So far, so good.

Next Steps

The two next steps are QUAD buffers for the far distance and OCCLUSION volumes for the stuff immediately around me, drawn front to back starting with the HIGH LOD, then LOW LOD, then QUADs, and all through the filter of a camera frustum. All these together should allow me to render only a handful of polygons for the view you are currently watching, and disregard much of the rest of the level. It should also present very little performance hit as the stuff is created and managed, especially as the front to back sorting is done on a permanent visible object list, so it only needs to tax the processor when a change is required (which happens in small bursts as you move around and the order changes).

Signing Off

I am happy with the week's work. Even though my two-hour read-up on spatial databases has not contributed directly to the speed-up work being undertaken, it has given me a sense of clarity on how to approach a situation to gain speed in simple ways. It also means when the time comes to do quick searches based on spatial items in the world, I have this knowledge to fall back on. I even discovered a cool query you can do with spatial databases: project a line through a 2D collection of boxes and it will instantly report which of those boxes were involved in the intersect test. Much better than a for-next loop through every object in range of the bullet ray(s); it can quickly detect when the ray hits something solid and only check for dynamic entities within that reduced range!  It's the little things that will give us that super smooth performance, and even if I cannot afford the distraction of coding these ideas, making a small note in the source code will ensure I can come back to these little pearls.
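That line-through-boxes query can be sketched with the classic slab test, clipping the ray's parameter range against each axis in turn. The names below (`Box2D`, `boxesOnRay`) are hypothetical, just to show the idea:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

struct Box2D { float minx, miny, maxx, maxy; };

// Slab test: clip the ray's t range against one axis interval; the ray can
// still hit the box only while the surviving range is non-empty.
static bool clipSlab(float o, float d, float lo, float hi,
                     float& tmin, float& tmax)
{
    if (std::fabs(d) < 1e-8f) return o >= lo && o <= hi;  // ray parallel to slab
    float t1 = (lo - o) / d, t2 = (hi - o) / d;
    if (t1 > t2) std::swap(t1, t2);
    tmin = std::max(tmin, t1);
    tmax = std::min(tmax, t2);
    return tmin <= tmax;
}

bool rayHitsBox(float ox, float oy, float dx, float dy, const Box2D& b)
{
    float tmin = 0.0f, tmax = 1e9f;
    return clipSlab(ox, dx, b.minx, b.maxx, tmin, tmax)
        && clipSlab(oy, dy, b.miny, b.maxy, tmin, tmax);
}

// Report every box the bullet ray passes through, as the spatial query would.
std::vector<int> boxesOnRay(float ox, float oy, float dx, float dy,
                            const std::vector<Box2D>& boxes)
{
    std::vector<int> hits;
    for (int i = 0; i < (int)boxes.size(); ++i)
        if (rayHitsBox(ox, oy, dx, dy, boxes[i])) hits.push_back(i);
    return hits;
}
```

A real spatial index would walk its tree instead of this linear loop, but the per-box test is the same.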

I will be attending an Intel Game Fest this Saturday morning for 1 hour to help would-be Android developers understand the wonderful world of AGK. If you know me, you know I can talk for an hour on practically any subject, but it is usually a rant or a long verbal waffle. Being center stage for an hour and not repeating myself too many times will be a challenge.  Should be a nice end-of-week distraction to kick-start my weekend.  It's about 1:30AM now and my Hangout is at 5AM, so I am going to prepare some AGK material for half an hour and then spend the rest of the time adding a QUAD buffer idea I just had to my instance stamp engine prototype.  Sorry there is no video or shot, but when I grabbed one, the quality was so 'programmer art' I dare not inflict it on you.  I think the next substantial shot will be a new view of my 'run to the river' demo with a substantially better FPS score :)

Thursday, November 14, 2013

Thursday Spatial Indexing

A Rare Day

It is not often I read up on the contemporary literature of the day, as my old-school habits lead me astray towards the 'do it yourself' approach. Today, however, I was impressed enough with the promise of a C++ library called BOOST to spend some time on pure reading.

A pretty well-rounded library, with a fiendishly fast set of routines to speed up spatial searching and database creation. After 2 hours of reading, I had learned the API and all the examples, and was ready to do some prototyping. I then decided to sketch out what I needed to build a much faster scene management system for Reloaded.

After two pages of A4, it was apparent that for my initial coding I did not need a spatial query system, as the current system I was using in the Instance Stamp system was FASTER than the BOOST library and used only marginally more memory.  I thus decided plan A was to re-activate the Instance Stamp solution with the changes required to reflect the way static entities exist in the current engine. So I don't bite off too much, I have put the occlusion and quad engines to one side so I can focus on the 'up close and personal' polygons that surround the player immediately in the scene.  From there, I can create bounding volumes of geometry to reflect the instance stamp buffer collections and do the occlusion, and finally fill the missing 'distance' objects with quads. My sketch told me the order of the work as well as highlighting the specific details of what was missing to get from task A to task B.

Instance Stamp

As a recap, instance stamps are the idea that instances of objects are placed in a reference grid, and when the player is close enough to these references, a dynamic copy of that geometry is created in a common buffer for rendering. As a stamp moves out of view, the dynamic buffer is released, and as new references come into view, they are created using the discarded buffers. Essentially the level can be extremely large without consuming a large amount of static geometry memory.  The performance hit is the dynamic creation, which if kept low enough becomes completely unnoticeable (and prototypes have proven I can achieve hundreds of FPS, no problem).
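A minimal sketch of that recycle mechanic, assuming a simple square grid and made-up names (`StampWorld`, `acquire` and friends are mine, not engine code), might look like this:

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// A stamp is just a grid reference; the "buffer" id stands in for a chunk of
// geometry built on demand. -1 means the geometry has not been created.
struct Stamp { int cellX, cellZ, buffer; };

class StampWorld {
public:
    StampWorld(int w, int d) {
        for (int x = 0; x < w; ++x)
            for (int z = 0; z < d; ++z)
                stamps.push_back({x, z, -1});
    }
    // Create geometry near the player, recycle buffers that drift out of range.
    void update(int px, int pz, int range) {
        for (Stamp& s : stamps) {
            bool near = std::abs(s.cellX - px) <= range
                     && std::abs(s.cellZ - pz) <= range;
            if (near && s.buffer == -1) {
                s.buffer = acquire();
            } else if (!near && s.buffer != -1) {
                freeList.push_back(s.buffer);          // recycle, don't destroy
                s.buffer = -1;
            }
        }
    }
    int created = 0;                                   // total buffers ever made
    int active() const {
        int n = 0;
        for (const Stamp& s : stamps) if (s.buffer != -1) ++n;
        return n;
    }
private:
    int acquire() {
        if (!freeList.empty()) {
            int b = freeList.back(); freeList.pop_back(); return b;
        }
        return created++;
    }
    std::vector<Stamp> stamps;
    std::vector<int> freeList;
};
```

The point of the free list is that total allocations stay bounded by the worst-case visible set however far the player roams; everything after that comes from recycled buffers.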

It's about 9:15PM now and my plan of a regular day went out the window today, as I only rose after dinner time, meaning my day was already on a skew. I also have a 5AM appointment on Saturday which forces me to either stay up amazingly late on Friday, or go to bed insanely early to get up on Saturday before the larks and seagulls.  I have no doubt I will manage it either way, but it makes for an interesting schedule between now and then. It is for this reason, and while my brain is keen, that I want to get some of the instance stamp engine coded now so I can see some early results.  All the commands are still in place from last time, so it's just a case of familiarizing myself with the old prototype, and migrating those ideas over to how the static entities are situated right now.  Hopefully before Friday I can report a speed up :)

Signing Off

Looking back on my blogs, I must apologise for the lack of screenshots and lovely eye candy.  This is the essential truth of the programmer, and what most weeks actually look like down in the coding dungeon. Lots and lots of ideas, much code and very little visuals to show for it.  You only get visuals when some PR guy taps you on the shoulder and asks for something 'real' to show :)  Alas, this is the way of the world.  Until my next 'real' blog, I shall continue to crank out more of this alien gobbledygook and look forward to the day when I can report some serious performance boosts.


In the meantime, here is the small scene I am using as a template to first test my instance stamp batcher and ultimately my occluder.  As you can see, from an FPS point of view, and with occlusion working, very few of the buildings should ever actually be rendered, giving us low polygon and draw-call counts and a high FPS as a result. Here's hoping!!

Solar in California and Arizona: More of a good thing

You’d think the thrill might wear off this whole renewable energy investing thing after a while. Nope—we’re still as into it as ever, which is why we’re so pleased to announce our 14th investment: We’re partnering with global investment firm KKR to invest in six utility-scale solar facilities in California and Arizona. Developed by leading solar developer Recurrent Energy, the projects have a combined capacity of 106MW and will generate enough electricity to power over 17,000 U.S. homes. Google will make an approximately $80 million investment into these facilities.
The 17.5 MWac/22 MWp Victor Phelan project (pictured), located in San Bernardino, Calif., is part of six Recurrent Energy developed projects acquired by Google and KKR. The six-project portfolio is expected to be operational by early 2014 and will generate enough clean electricity to power more than 17,000 U.S. homes.

This investment is similar to one we made back in 2011, when we teamed up with KKR and invested $94 million in four solar facilities developed by Recurrent. Those facilities have since started generating electricity, and we’ve committed hundreds of millions more—more than $1 billion in total—to renewable energy projects around the world.

These investments are all part of our drive toward a clean energy future—where renewable energy is abundant, accessible and affordable. By continuing to invest in renewable energy projects, purchasing clean energy for our operations and working with our utility partners to create new options for ourselves and for other companies interested in buying renewable energy, we’re working hard to make that future a reality.

Government requests for user information double over three years

In a year in which government surveillance has dominated the headlines, today we’re updating our Transparency Report for the eighth time. Since we began sharing these figures with you in 2010, requests from governments for user information have increased by more than 100 percent. This comes as usage of our services continues to grow, but also as more governments have made requests than ever before. And these numbers only include the requests we’re allowed to publish.
Over the past three years, we’ve continued to add more details to the report, and we’re doing so again today. We’re including additional information about legal process for U.S. criminal requests: breaking out emergency disclosures, wiretap orders, pen register orders and other court orders.

We want to go even further. We believe it’s your right to know what kinds of requests and how many each government is making of us and other companies. However, the U.S. Department of Justice contends that U.S. law does not allow us to share information about some national security requests that we might receive. Specifically, the U.S. government argues that we cannot share information about the requests we receive (if any) under the Foreign Intelligence Surveillance Act. But you deserve to know.

Earlier this year, we brought a federal case to assert that we do indeed have the right to shine more light on the FISA process. In addition, we recently wrote a letter of support (PDF) for two pieces of legislation currently proposed in the U.S. Congress. And we’re asking governments around the world to uphold international legal agreements that respect the laws of different countries and guarantee standards for due process are met.

Our promise to you is to continue to make this report robust, to defend your information from overly broad government requests, and to push for greater transparency around the world.

Wednesday, November 13, 2013

Street View floats into Venice

Venice was once described as “undoubtedly the most beautiful city built by man,” and from these pictures it’s hard to disagree. You can now explore panoramic imagery of one of the most romantic spots in the world, captured with our Street View Trekker technology.

It was impossible for us to collect images of Venice with a Street View car or trike—blame the picturesque canals and narrow cobbled walkways—but our team of backpackers took to the streets to give Google Maps a truly Shakespearean backdrop. And not just the streets—we also loaded the Trekker onto a boat and floated by the famous gondolas to give you the best experience of Venice short of being there.
Our Trekker operator taking a well-earned rest while the gondolier does the hard work
The beautiful Piazza San Marco, where you can discover Doge's Palace, St. Mark's Cathedral, the bell tower, the Marciana National Library and the clocktower

We covered a lot of ground—about 265 miles on foot and 114 miles by boat—capturing not only iconic landmarks but several hidden gems, such as the Synagogue of the first Jewish Ghetto, the Devil’s Bridge in Torcello island, a mask to scare the same Devil off the church of Santa Maria Formosa and the place where the typographer Manutius created the Italics font. Unfortunately, Street View can’t serve you a cicchetto (local appetizer) in a classic bacaro (a typical Venetian bar), though we can show you how to get there.
The Devil’s Bridge in Torcello Island

Once you’ve explored the city streets of today, you can immerse yourself in the beauty of Venice’s past by diving deep into the artworks of the Museo Correr, which has joined the Google Cultural Institute along with Museo del Vetro and Ca’ Pesaro - International Gallery of Modern Art.
Click on a pin under "Take a tour" to compare the modern streets with paintings of the same spots by artists such as Carpaccio and Cesare Vecellio
Or delve into historical maps of Venice, like this one showing the Frari Church, built in 1396

Finally, take a look behind the scenes showing how we captured our Street View imagery in Venice.

The Floating City is steeped in culture; it’s easy to see why it’s retained a unique fascination and romance for artists, filmmakers, musicians, playwrights and pilgrims through the centuries—and now, we hope, for Street View tourists too.

Global Impact Award to improve veterans’ higher education

When veterans return home, a college degree is often a great next step for a successful transition to civilian life. But college can be a tough place for veterans, especially when they’re juggling classes with personal, family and financial pressures. Unfortunately there’s very little data about what can help veterans thrive in school. We want to change that.

Today, we’re granting a $3.2 million Global Impact Award to the Institute for Veterans and Military Families, Student Veterans of America, the Posse Foundation and Veterans of Foreign Wars to support data analysis of U.S. veterans’ higher education. The study will be made public and answer critical questions:

  • Which colleges are most successful at supporting veterans through to graduation day?
  • What on-campus programs have the biggest impact?
  • How do veterans’ education majors stack up against employment opportunities?

Based on the report, we’ll fund the expansion of the veterans’ programs found to be most effective—whether it’s on-campus child care, access to dedicated mental health services or physical gathering spaces—and will also provide Googler support to make this project a reality.

This award builds on our work to train and mentor student veterans through the Google Veterans Network. We’re proud to serve those who’ve served.

Wednesday Quad Reducing

Performance Is King

As a mere subject of the king, my work today was rather dull, with plenty of nothing to screenshot for your delectation! I had a big six-hour think last night due to insomnia and came up with a radical plan to transform performance metrics in Reloaded. It involves a bottom-up approach to the solution, which is 'what don't I need to draw right now?'.

Given this simple premise, I started by getting rid of all the objects until I was left with the floor and sky.  The terrain floor was happily gobbling up over 30K polygons for a completely flat floor.  I immediately remembered the QUAD REDUCE function of the Blitz Terrain system, so decided Wednesday would be about using that feature. I already knew it could not simply be turned on; it had to be flagged before passing in a height-map.  Generating and testing that height-map took most of the day.
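To show the spirit of quad reduction (and not the actual Blitz Terrain internals, which I have no sight of), a toy version can be written as a quadtree merge over the height-map: any power-of-two block whose vertices all share one height collapses into a single quad, which is why a completely flat floor should never cost 30K polygons. All names here are illustrative:

```cpp
#include <cassert>
#include <vector>

// Toy quad-reduce: recursively merge a power-of-two block of the heightmap
// into a single quad when every vertex inside it has the same height,
// otherwise split into four children. Returns the number of quads emitted.
int countQuads(const std::vector<float>& h, int mapW, int x, int z, int size)
{
    float first = h[z * mapW + x];
    bool uniform = true;
    for (int zz = z; zz <= z + size && uniform; ++zz)
        for (int xx = x; xx <= x + size; ++xx)
            if (h[zz * mapW + xx] != first) { uniform = false; break; }
    if (uniform || size == 1) return 1;                // one merged quad
    int half = size / 2;
    return countQuads(h, mapW, x,        z,        half)
         + countQuads(h, mapW, x + half, z,        half)
         + countQuads(h, mapW, x,        z + half, half)
         + countQuads(h, mapW, x + half, z + half, half);
}
```

A production version would also stitch T-junctions between blocks of different sizes, which is exactly the sort of detail that makes the height-map flagging fiddly.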

The Performance Plan

Essentially I am going to build a spatial R-tree database of all entities, create 3D box volumes for the enclosed groups of entities, then generate pools of quads that share a communal texture. The quads and communal texture are generated just before the game starts and will take very little memory. I then re-use my Instance Stamp system to batch together all the HIGH and MID static entities relative to the local position of the player, which means we only create what we need to see within our close range. I then use GPU occlusion detection with the r-tree volumes to work out what I don't need to render, and then add dynamic entities only where volumes are visible for that cycle. The quad will have multiple textures depending on which angle you are looking at it from, and one for when looking at it from above. A shader will control the quad rotation and texture selection to avoid writing into the vertex buffer, and the same batches of static buffers and quads can be used by the shadow rendering process.  The upshot is that we will only render the large pools of static common geometry that are visible, and only the dynamic geometry that is associated with a visible r-tree volume, giving us the minimum amount to draw for the maximum performance.

The quad system is pretty similar to how Dark Imposter does things, but won't be required to regenerate the object view as the player moves. By pre-rendering eight fixed directions around the entity for our quad texturing, we will sacrifice that millimeter perfect quad visual for more speed, which is the objective at the moment. I must say though I really like the way Dark Imposter can produce a quad texture in real-time and regenerate it as you move around the object, and still manage to keep hundreds of FPS in the bag! If only I knew how they did it :)
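The eight-fixed-directions selection can be sketched like this; `imposterIndex` is a made-up helper for illustration, not engine code. It maps the camera's yaw around the entity to one of eight pre-rendered textures, each covering a 45-degree slice:

```cpp
#include <cassert>
#include <cmath>

// Pick which of the eight pre-rendered imposter textures to show, from the
// camera's yaw around the entity (each texture covers a 45-degree slice,
// centred on its render direction).
int imposterIndex(float entX, float entZ, float camX, float camZ)
{
    const float pi = 3.14159265f;
    float yaw = std::atan2(camX - entX, camZ - entZ);  // -pi..pi around Y
    float slice = 2.0f * pi / 8.0f;
    // offset by half a slice so each texture covers +/- 22.5 degrees
    int idx = (int)std::floor((yaw + slice * 0.5f) / slice);
    return ((idx % 8) + 8) % 8;                        // wrap into 0..7
}
```

The shader version would do the same arithmetic per quad, swapping texture coordinates rather than textures so the whole pool still draws in one call.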

Signing Off

All sounds impressive, right? :)  Well now I need to code it, and I am giving myself a few weeks to do this, but the end result should be entities as far as the eye can see, and every draw call that can be saved, saved. I can also use the r-tree spatial database to localize entity searches for things like physics interaction, sound playing, gun and missile ray-casting and, to some degree, entity logic.  Traditionally in FPSC Classic, we would have to step through EVERY entity in the level to perform our tasks, even if we only needed access to just one or a few close items.  Having a spatial database means our searches can be MUCH more targeted.

I am just finishing off some terrain quad reduction work, and then Thursday I will delve into the spatial functions of the BOOST library, which promise some nice algorithms to make my database implementation go smoothly.  Once I have my spatial hierarchy in place, creating batches of common buffers from it and hiding them via GPU occlusion should be a little easier!

Tuesday, November 12, 2013

Tuesday Treaker

Aha!

The sound of victory, perhaps; the sound of a small mystery uncovered, definitely. You will be disenchanted to learn I was not working on performance today. I was working on that annoying, horrible vertex corruption we have been seeing in the gun and characters.  You will NEVER guess what I found inside my engine source code:

// sort into time ascending order at init stage
// very time consuming - so skip until see adverse affects!

//if ( pAnim ) SortAnimationDataByTime ( pAnim );

Amazing or what!  I must have been insanely tired or blind to comment this function out, not provide a date, to even think that I could get away with it, and finally to lose all trace of ever doing it and carry on with my day.

I have not actually uncommented and compiled yet but I would bet a pint of Guinness that this is the bug. I vaguely remember removing this line to speed up character loading, madness!!

Other News

As yesterday's blog was a little small, I want to report on what happened on Monday in a little more detail. My PC decided to corrupt one file (just one) at the I/O level, which means a PHYSICAL hole in my data. All the usual suspects such as CHKDSK, defrag, rename, remove and rebel were all in vain. I left the PC to do a full defrag overnight in the hope it would move the offending file to some new sector which did not have a hole.  No joy.  First thing today I renamed my SVN repository, moved all the files over and left that one file in the old location, locked and alone.  It's such a dangerous file that even if I cursor over it Windows File Explorer will crash!

As this is hardly a professional status quo, I have ordered a replacement 128GB SSD drive and will have the miserable job of a full Windows re-install sometime this week. The good news is that the new drive is much faster than the old one, and starting Windows afresh always provides a nice fast OS (at least to begin with) to enjoy for a few months.  It arrives Wednesday, so that will probably be the day.

My Mission

I decided my mission for today was to either fix the vertex corruption or make the engine faster. I tried reducing the shaders and noticed almost no performance improvement on my 650 Ti Boost (likely due to being CPU bound rather than GPU bound), and also failed to get the NSIGHT tool to do anything practical (after re-installing drivers, updating the tool, changing my WDD settings, fiddling with configurations and ultimately getting absolutely nothing from my attempt to profile my frames). I finally got to the line of code you saw above, which means in theory I have achieved something today!

Signing Off

A meeting was held today in other parts of TGC land, and it was agreed by all that performance and key bug fixes are to remain our top priority, and this attitude will continue until we have a solid and fast architecture.  I am the first to leap up and down when new features are busting to be coded, but I am also a realist, and I learn from my past mistakes, and unless we get the core of the engine correct there is little point in adding pretty tinsel over it.  

So today we 'maybe' fixed the animating corruption issue, tomorrow I fix my HDD corruption issue, and Thursday I dive headlong into shadow batching, which I can confidently predict will give me a 10fps speed-up. When you consider that means going from 26fps to 35fps, that's a substantial performance boost!