Gamer Footprint April 2015 Update

25. April 2015 22:10 by Cameron in Durandal, Gamer Footprint

Hey guys! Gamer Footprint has been receiving some cool updates lately! I have nearly finished the incremental scanner for achievements/trophies and games played! I have just a couple of issues to work out before the scanner is stable enough to run automatically without supervision. One of the main issues stems from request throttling on Sony's side, but I think I can work around that by limiting the number of requests per minute. There are definitely tweaks that could improve the scanner, but the incremental portion has certainly made updating the timeline a ton faster.
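
To give a rough idea of the workaround, here is a minimal sketch of the kind of per-minute cap I have in mind (RequestThrottle is illustration code for this post, not the actual scanner):

using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical throttle: allows at most N requests to start per minute by
// holding a semaphore slot for one minute after each acquisition.
public class RequestThrottle
{
    private readonly SemaphoreSlim semaphore;

    public RequestThrottle(int maxRequestsPerMinute)
    {
        semaphore = new SemaphoreSlim(maxRequestsPerMinute);
    }

    public async Task WaitAsync()
    {
        // Wait until a request slot is free...
        await semaphore.WaitAsync();

        // ...then schedule the slot to be returned one minute from now.
        var ignored = Task.Delay(TimeSpan.FromMinutes(1))
                          .ContinueWith(t => semaphore.Release());
    }
}

Each call to Sony's endpoints would then be preceded by await throttle.WaitAsync(), capping how many requests can start in any one-minute window.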

Now that I've extracted a ton of data, I hope to get into more of the frontend development soon. I've set up several skeleton pages that I'll be filling in with content. I want to display the games that people play and the trophies/achievements available for each game. Game pages will be populated automatically from the scanner, including publisher, developer, game description, release date, and several other available pieces of information. I've updated the site template to bring it more in line with flat UI design, and I'm simplifying the top menu so that it is easier to navigate and more mobile friendly. I'm not a huge fan of hamburger menus on mobile devices; I believe content should be easily accessible through navigation menus and links on the respective pages.

I've been paying close attention to Rob Eisenberg's new framework, Aurelia, and I'm seriously considering it for future Gamer Footprint development. I love Durandal and have been a huge supporter of the project, but Aurelia offers so much more by leveraging ES6 and ES7 features in the framework. Aurelia is relatively new, having been released in February 2015, but I have faith that it will develop into a great product. I'll be experimenting with Aurelia over the coming months and preparing an upgrade path for Gamer Footprint. To the end user, the framework I use on the frontend/backend means nothing; for performance, though, Aurelia seems to be stacking up quite well against other frontend frameworks on the market.

Please stay tuned!

Gamer Footprint Update and E1337 Entertainment Merger

Hey guys! I know it's been about a month since the last update, but things have been pretty busy lately. I am in the process of merging Gamer Footprint with E1337 Entertainment in order to bring tournaments to the Gamer Footprint community and statistics tracking to the E1337 Entertainment community. We are very excited to be joining forces! As part of the merger, over the next couple of months we will add support for tournament brackets, an interactive events calendar, and tournament statistics tracking.

I managed to get automated builds working with TeamCity for both the development and production sites for Gamer Footprint. This was a fun task to finish. I set up a build for each environment that pulls the latest source from Bitbucket and walks through the build steps: restoring NuGet packages, building the source, and (on the production environment) minifying the JavaScript. Within a minute or two of pushing changes to Bitbucket, they are live in their respective environments.

Since the last update, I have implemented a few new features and fixed a couple of bugs:

I finished the first iteration of timelines on both the global scale and the personal scale. For now, the timelines are generated manually; I am still working on the incremental updates that will run automatically, which I expect to finish in the next couple of weeks. Both the global and personal timelines include games played on PlayStation Network and Xbox Live and the trophies or achievements earned on each network. I'm playing with the idea of letting you subscribe to or follow other users so that their events feed into your own subscription feed; that way, your timeline isn't filled with events from everyone on Gamer Footprint, which could grow quite large over time. As development progresses, I will add other events to the timeline, including games/achievements from Steam, games/achievements from Battle.net, and potentially statistics from other games such as Halo 4.

I've been experimenting with push notifications via SignalR to send near-realtime updates of gamers' online presence on Xbox Live and PlayStation Network. This is still in active development and is not considered finished. I will look at adding presence information from other networks such as Steam in the near future. Through push notifications, we can also send messages to all actively connected users on the website or to specific users. Messages can include maintenance notifications and estimated downtime, alerts when your friends come online on a specific network, achievement/trophy unlocks, and much more.
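
A rough sketch of what that looks like with SignalR (PresenceHub, presenceChanged, and notify are illustration names, not the production code):

using Microsoft.AspNet.SignalR;

// Clients connect to this hub to receive presence updates.
public class PresenceHub : Hub
{
}

public static class PresenceNotifier
{
    // Broadcast a presence change to every connected client...
    public static void NotifyAll(string gamertag, string status)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<PresenceHub>();
        hub.Clients.All.presenceChanged(gamertag, status);
    }

    // ...or target one logged-in user, e.g. for a maintenance notice.
    public static void NotifyUser(string userId, string message)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<PresenceHub>();
        hub.Clients.User(userId).notify(message);
    }
}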

The goal for Gamer Footprint and E1337 Entertainment is to provide a place for gamers to meet, collaborate, and participate in tournaments from around the world. Without Gamer Footprint, each gaming community is separated by platform or console. We want to bridge that gap and promote organic connections with players from a plethora of gaming backgrounds.

Please stay informed on the latest updates for both Gamer Footprint and E1337 Entertainment. The name/branding for the merged product is still being decided, but we will keep you posted.

Gamer Footprint Update

25. February 2015 11:32 by Cameron in Gamer Footprint

It's been a little while since I've updated everyone about Gamer Footprint. Development is picking up as I spend more time on it each week. I've been primarily working on the backend since the last update, but I have added some functionality to the frontend too.

Users can now register/login using their Google, Microsoft, Facebook, or Twitter account through OAuth2!
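
For the curious, the OWIN middleware wiring for external logins looks roughly like this (a sketch with placeholder credentials; the cookie middleware and most provider options are elided):

using Microsoft.Owin.Security.Google;
using Owin;

public partial class Startup
{
    public void ConfigureAuth(IAppBuilder app)
    {
        // Cookie/external sign-in middleware omitted for brevity.
        // Each provider issues its own ID/secret pair (placeholders below).
        app.UseMicrosoftAccountAuthentication("client-id", "client-secret");
        app.UseTwitterAuthentication("consumer-key", "consumer-secret");
        app.UseFacebookAuthentication("app-id", "app-secret");
        app.UseGoogleAuthentication(new GoogleOAuth2AuthenticationOptions
        {
            ClientId = "client-id",
            ClientSecret = "client-secret"
        });
    }
}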

Xbox and PSN account linking is now functional, and you can see your Xbox Live and PSN presence if you have linked your accounts. Currently, the profile page polls the server every 20 seconds to check for a change in presence information. However, I'm looking at making this more like a push notification service from the server through SignalR. The trick will be setting up per-user presence push notifications, since the server will still poll the networks and should only push to the client when something changes. I attempted this a little while back, but I couldn't get the connection management quite right to cancel running tasks. I will attempt the push notifications at some point, but it's not a high priority at the moment.
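
The shape I have in mind is roughly the following: keep polling on the server, remember the last known state per gamertag, and only push when something actually changes. A sketch (GetPresenceAsync is a hypothetical stand-in for the real Xbox Live/PSN calls):

using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

// The same kind of empty hub sketched in the newer post above.
public class PresenceHub : Hub
{
}

public class PresenceWatcher
{
    // Last known presence per gamertag, so we only push real changes.
    private readonly ConcurrentDictionary<string, string> lastKnown =
        new ConcurrentDictionary<string, string>();

    public async Task CheckAsync(string gamertag)
    {
        string current = await GetPresenceAsync(gamertag);

        string previous;
        bool changed = !lastKnown.TryGetValue(gamertag, out previous)
                       || previous != current;
        lastKnown[gamertag] = current;

        if (changed)
        {
            // Push only when the status differs from the previous poll.
            GlobalHost.ConnectionManager
                      .GetHubContext<PresenceHub>()
                      .Clients.All.presenceChanged(gamertag, current);
        }
    }

    private Task<string> GetPresenceAsync(string gamertag)
    {
        // Stand-in for the real Xbox Live/PSN presence lookup.
        return Task.FromResult("Online");
    }
}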

I'm reworking my data persistence and scanning algorithms to update a user's played games on PSN, Xbox, and other platforms. Once that is complete, there will be a timeline feature that incorporates all of the actions a user wants displayed. Tracking games played and achievements/trophies won't require a login, but because of privacy settings on each network, you'll need to link your accounts with Gamer Footprint if you want to share your online status with your Gamer Footprint friends and the community. As it stands, there is no way to see what someone is actively playing on PSN unless you are their friend. This shouldn't be a big deal, though. Rest assured, your information is safe, and no passwords for Xbox or PSN accounts are ever stored on our server.

I will be posting more updates in the coming weeks so stay tuned!

Happy 2015 and Development Update

7. January 2015 10:52 by Cameron in Gamer Footprint

I know I'm a few days late, but Happy New Year 2015! It's crazy how time flies! Last year was a big one: I got married, changed jobs, and moved to North Atlanta! They say that some of the biggest life events are getting married, changing jobs, and moving. I did all three! My wife and I are finally beginning to settle in at our new jobs and in our new area.

Gamer Footprint development is slow but still active. I added an API for gathering the latest firmware versions for various consoles, including the 3DS, Wii U, Wii, PSP, PS Vita, PS4, PS3, Xbox, Xbox 360, and Xbox One.

A full list of supported consoles can be found here: http://dev.gamerfootprint.com/api/firmware/consoles

A sample response for 3DS: http://dev.gamerfootprint.com/api/firmware/3ds

[{"versionNumber":"9.4.0-21 U","region":"US"},
 {"versionNumber":"9.4.0-21 J","region":"JP"},
 {"versionNumber":"9.4.0-21 E","region":"EU"}]

I plan to add caching to the firmware versions API, since it can take a long time to pull back firmware versions for consoles such as the 3DS or Wii U, which have a separate firmware version source for each region. I'm also working on a generic Neo4jClient repository for managing stored objects and relationships. This will help abstract away the need to interface with Neo4jClient directly. There is an existing Neo4jClient repository, but it hasn't been updated for Neo4j 2.0. I plan to open source my work once complete so that others may benefit.
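
For the caching piece, something along these lines should work (a sketch using System.Runtime.Caching; FirmwareVersionCache and FetchVersionsFromSources are hypothetical stand-ins, not the real implementation):

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class FirmwareVersionCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public IList<string> GetVersions(string console)
    {
        // Fast path: return the cached versions if we fetched them recently.
        var cached = Cache.Get(console) as IList<string>;
        if (cached != null)
            return cached;

        // Slow path: hit each region's firmware source, then cache for an hour.
        IList<string> versions = FetchVersionsFromSources(console);
        Cache.Set(console, versions, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddHours(1)
        });
        return versions;
    }

    private IList<string> FetchVersionsFromSources(string console)
    {
        // Hypothetical stand-in for the real per-region lookups.
        return new List<string> { "9.4.0-21 U" };
    }
}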

I've learned a ton about generic programming while writing the generic repository. While some may argue that the generic repository is an anti-pattern, it has its uses when paired with a non-generic repository for gathering/storing objects related to a specific model. I am working on abstracting the Neo4jClient calls such that I can pass LINQ expressions directly to the query engine without callers having to access it explicitly. I'll post some example code shortly to show some of my progress.
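
In the meantime, here is a rough sketch of the repository surface I'm aiming for (the intended API, not the finished library):

using System;
using System.Collections.Generic;
using System.Linq.Expressions;

// Callers pass LINQ expressions; they never touch the Neo4jClient
// query engine directly.
public interface IRepository<T> where T : class
{
    T Get(Expression<Func<T, bool>> predicate);
    IEnumerable<T> Where(Expression<Func<T, bool>> predicate);
    void Add(T entity);
    void Delete(Expression<Func<T, bool>> predicate);
}

A Neo4jClient-backed implementation would translate those predicates into Cypher; Neo4jClient's fluent Cypher API already accepts lambda expressions in its Where clauses, which is what makes this kind of pass-through feasible.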

I'm continuing to develop the Durandal/Web API/OAuth implementation of Gamer Footprint, and once I have the data layer established, I will begin writing more page functionality. The Gamer Footprint development site is still very unfinished, but in the next couple of months I expect to have basic account management and profiles done. Please stay tuned!

OWIN Self-Hosted Test Server for Integration Testing of OData and Web API

A co-worker and I were recently tasked with performing integration testing on OData and Web API services. You can view his posts on the subject in his series: Part 1, Part 2, and Part 3. Traditionally, one might mock the web requests and responses, but by using the TestServer class found in the Microsoft.Owin.Testing namespace, we can start an in-memory HTTP server for doing full integration tests. You can get the NuGet package here.

To start, create a new unit testing project with MSTest and a new ASP.NET MVC / Web API project. In the ASP.NET MVC / Web API project, install Web API 2.2 and Web API 2.2 for OData v4.0 and OData v1-3. For the unit test project, install Web API 2.2, Web API 2.2 for OData v4.0 and OData v1-3, Web API 2.2 OWIN Self Host, Web API 2.2 OWIN, and Microsoft.Owin.Testing. Below is a sample unit/integration test that sets up an OWIN test server for in-memory integration testing:

namespace SelfHosting.Test
{
    using Microsoft.Owin.Testing;
    using Microsoft.VisualStudio.TestTools.UnitTesting;
    using Owin;
    using SelfHosting.Test.Models;
    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Http;
    using System.Web.Http.Dispatcher;
    using System.Web.OData.Builder;
    using System.Web.OData.Extensions;
    using WebApp.Models;

    [TestClass]
    public class SelfHostingTest
    {
        protected TestServer server;

        [TestInitialize]
        public void Setup()
        {
            server = TestServer.Create(app =>
            {
                HttpConfiguration config = new HttpConfiguration();
                WebAppFacade.WebApiConfig.Register(config);
                app.UseWebApi(config);
            });
        }

        [TestCleanup]
        public void TearDown()
        {
            if (server != null)
                server.Dispose();
        }

        [TestMethod]
        public async Task TestODataMetaData()
        {
            HttpResponseMessage response = await server.CreateRequest("/odata/?$metadata").GetAsync();

            var result = await response.Content.ReadAsAsync<ODataMetaData>();

            Assert.IsTrue(result.value.Count > 0, "Unable to obtain meta data");
        }

        [TestMethod]
        public async Task TestWebApi()
        {
            HttpResponseMessage response = await server.CreateRequest("/api/values").GetAsync();

            var result = await response.Content.ReadAsStringAsync();

            Assert.AreEqual("\"Hello from foreign assembly!\"", result, "/api/values not configured correctly");
        }

        [TestMethod]
        public async Task TestODataPeople()
        {
            HttpResponseMessage response = await server.CreateRequest("/odata/People").GetAsync();

            var result = await response.Content.ReadAsAsync<ODataResponse<Person>>();

            Assert.AreEqual(3, result.value.Count, "Expected 3 people to return from /odata/People");
        }
    }

}

The OData metadata is deserialized into a POCO (plain old C# object):

namespace SelfHosting.Test.Models
{
    using Newtonsoft.Json;
    using System.Collections.Generic;

    public class ODataMetaData
    {
        // The response exposes "@odata.context"; it needs an explicit mapping
        // since '@' and '.' can't appear in C# identifiers.
        [JsonProperty("@odata.context")]
        public string odatacontext { get; set; }

        public List<Value> value { get; set; }
    }

    public class Value
    {
        public string name { get; set; }
        public string kind { get; set; }
        public string url { get; set; }
    }
}

By using a generic ODataResponse class, we can deserialize our OData response into any POCO:

namespace SelfHosting.Test.Models
{
    using Newtonsoft.Json;
    using System.Collections.Generic;

    public class ODataResponse<T>
        where T : class, new()
    {
        [JsonProperty("@odata.context")]
        public string odatacontext { get; set; }

        public List<T> value { get; set; }
    }
}

The beauty of using the TestServer is that it is self-contained: the HTTP server is inaccessible outside of the process, and once the tests complete, the server is shut down. The WebApiConfig registered with the TestServer determines which controllers and routes to load for testing, so no production code needs to change in order to test existing Web API and OData controllers. The only problem I have found is that attribute routes don't seem to register correctly; perhaps I have not found the correct method of registering attribute routes with the TestServer.
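
If you hit the same attribute route issue, one thing worth trying (I haven't confirmed it solves every case) is mapping attribute routes and forcing initialization before the server takes requests:

server = TestServer.Create(app =>
{
    HttpConfiguration config = new HttpConfiguration();
    // Map [Route]/[RoutePrefix] attributes (skip if Register already does this).
    config.MapHttpAttributeRoutes();
    WebAppFacade.WebApiConfig.Register(config);
    // Force the route table and controller resolution to initialize eagerly.
    config.EnsureInitialized();
    app.UseWebApi(config);
});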

Here is the Visual Studio 2013 solution with both a web project and a unit testing project:

SelfHostingUnitTest.zip (1.39 mb)

Resolving TFS Conflicts with Take Source

7. October 2014 14:17 by Cameron in Programming

Today, I had to merge a bunch of items from one branch to another, but with roughly 350+ conflicts, manually accepting Take Source for each one wasn't going to cut it. I found that if you open the Visual Studio Command Prompt, you have access to the tf tool, which lets you work with TFS from a command-line interface. After changing to my solution's directory, the following command merged my changes automatically by resolving every conflict in favor of the source branch:

tf resolve /auto:TakeTheirs

After about a minute or two, all conflicts were resolved, and I checked in my changes. This is a huge time saver, as resolving them manually would have taken orders of magnitude longer.
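
For reference, tf resolve also accepts an item spec if you only want to auto-resolve part of the tree, and /auto supports other strategies such as KeepYours, which keeps the target branch's version ($/MyProject/MyBranch below is a placeholder path):

tf resolve $/MyProject/MyBranch /recursive /auto:TakeTheirs
tf resolve /auto:KeepYours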

Life Update and Gamer Footprint Update

7. August 2014 09:03 by Cameron

Hey guys, I know it's been a while since I've written a blog post. I've been busy with wedding planning and other life events! I'm still alive and kicking!

Gamer Footprint (GFP) continues development in the spare time that I have. It's making slow but sure progress. I've moved Gamer Footprint to a single page application (SPA) approach. Previous iterations of GFP were built on traditional ASP.NET MVC, but as of late, I've been on a single page application kick. :) SPAs have real advantages over traditional multi-page applications: you load the application shell once and then load/post content on demand, which makes the application feel faster and more responsive.

With GFP, I've been playing around with Web API 2 and Durandal. Web API 2 is Microsoft's take on RESTful API development. It's very robust and supports the full range of HTTP verbs used in RESTful design. Durandal lets me use RequireJS, Knockout.js, and jQuery to build rich single page applications. This proved a good choice for GFP because I already have experience with RequireJS, Knockout, and jQuery. Had I chosen AngularJS, I would have needed to learn a completely new framework. While some might not worry about that, I've already been using Knockout in some projects at my day job, so it made sense to learn Durandal.

Visual Studio Update 3 and TypeScript external module compilation

7. August 2014 08:54 by Cameron in TypeScript

I recently updated to Visual Studio 2013 Update 3, but it appears the TypeScript compiler is now a little more strict about external module compilation. If you receive an error message asking for the --module flag to be set, add this to the build configurations in your project file:

  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
    <TypeScriptModuleKind>amd</TypeScriptModuleKind>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
    <TypeScriptModuleKind>amd</TypeScriptModuleKind>
  </PropertyGroup>

After adding the TypeScriptModuleKind node to the project properties, the compiler can correctly compile external TypeScript modules again. (The property corresponds to the compiler's --module flag, e.g. tsc --module amd.)

Disqus now in effect for comments!

2. May 2014 14:02 by Cameron in Blog

I don't know why it took so long to make the move to Disqus for the comment system, but I'm glad I did. There is one caveat: all previous comments are now hidden. Moving forward, we will have much more flexibility for comments and hopefully less spam. You wouldn't believe how much spam I was getting with the standard comment system; some spam bots were so badly written that they had malformed anchor tags embedded in their comments. It's extremely clear which comments are spam and which aren't.

Knockout Observable Array Performance Improvements

Today, I was working on a project that makes AJAX calls to an API and noticed that the application was very slow and unresponsive during the AJAX calls and Knockout updates. I checked the requests in Chrome's developer tools and found they took at most 300ms and as little as 20ms to return. This was very odd behavior, since the AJAX requests weren't blocking; the slowdown had to be elsewhere.

After some research, I found that when updating a ko.observableArray, you don't want to push each item individually if you can help it. Every push to a ko.observableArray notifies its subscribers, and depending on how many DOM elements are bound to the array, each notification can be rather taxing. I had code like this:

$.ajax({
	url: 'api/Document/',
	method: 'GET',
	dataType: 'json',
	success: function(data) {
		Documents([]);
		for(var i = 0; i < data.length; i++)
			Documents.push(data[i]);
	}
});

To fix this, push your items to a temporary array and then assign the ko.observableArray's contents in a single call; subscribers are then notified once instead of once per item. Your code should look more like this:

$.ajax({
	url: 'api/Document/',
	method: 'GET',
	dataType: 'json',
	success: function(data) {
		var documents = [];
		for(var i = 0; i < data.length; i++)
			documents.push(data[i]);	

		Documents(documents);
	}
});
