Lenovo x220 Hackintosh

14. October 2017 00:10 by Cameron in Hackintosh, Mac

I recently acquired a Lenovo x220 for $60 to build a nearly 100% compatible hackintosh laptop. The x220 series allows a custom BIOS to be flashed to remove the Wi-Fi whitelist enforced by Lenovo, so I was able to install a Broadcom AC wireless card. Following this guide, http://x220.mcdonnelltech.com/, I now have a 95% compatible hackintosh laptop. The only thing that doesn't work is the SD card reader, which isn't a big deal. I have installed macOS High Sierra on my 256GB mSATA SSD and set up the machine for iOS development. I plan to use this machine until I can buy a MacBook Pro or equivalent in the near future. Since the Lenovo x220 was released in 2011, I imagine I have about two years before this machine is dropped by the newest macOS, and by that point I should be able to replace the laptop or buy an official MacBook Pro.

My Gaming/Entertainment Setup

4. March 2017 17:22 by Cameron in gaming

In my living room, I have an open-back, three-shelf entertainment console. For my TV, I have an early 2016 model 58" 4K Vizio Smart TV. Unfortunately, this was before HDR was widely available and affordable, so I'll need to wait a while before upgrading to an HDR-capable TV. For gaming/entertainment, I have an original Xbox One, an original PS4, a custom-built mini PC, an XBMC hard-modded original Xbox, and a Hitachi VCR.

Xbox One

I've had this Xbox since 2014 and have thoroughly enjoyed playing titles such as Gears of War 4 and Rare Replay. I've also been able to play several of my Xbox 360 games through the backwards compatibility introduced in 2015. I need to finish up Red Dead Redemption before Red Dead Redemption 2 is released this fall!

PS4

I purchased the PS4 in the fall of 2013 in anticipation of Uncharted 4. I had to wait three years for its release, but it was well worth the wait. The game engine and graphics really demonstrate the hardware capabilities of the PS4. I knew the game would be great since I had played the original Uncharted trilogy on the PS3 a few years before.


For now, I am happy sticking with my original PS4 and Xbox One while I wait for the HDR market to settle down before investing in an HDR television. With Xbox Scorpio launching this fall, I'm interested to see how it stacks up against other 4K gaming platforms. I may stick with PC for 4K gaming, however, as I will be getting a GTX 1080 Ti for my desktop PC this summer.

Original Xbox

I purchased my original Xbox to use it as an emulation box. Prior to the original Xbox, I had a Sega Genesis, NES, SNES, and N64, and my entertainment console was quite full. I plan to get controller adapters so I can use the original controllers on my Xbox. A guy from Brazil makes a neat device called the Kade MiniConsole+ that works with several input and output configurations, allowing multiple plug-and-play setups with classic game controllers and various systems. I outfitted my original Xbox with a 250GB HDD, which is plenty for my Xbox game library and all of the emuXtras and madmab edition emulators. My original Xbox connects to my TV over component video and to my sound system over optical audio.

Hitachi VCR

You may find it odd that I have a VCR in 2017, but I used it primarily for a video conversion project for my family for Christmas. I converted all of our VHS tapes, Hi-8/Video-8 tapes, and 8mm film to digital video for long-term storage. I meant to blog about it before now, but I'll write a post soon with the details of the project.

PC

I recently built a new desktop for use as a home media server and a gaming machine for the living room. Last year, I purchased a Steam machine (Zotac NEN), but ultimately sold it and my Surface Pro 3 for my Alienware 15 R2. I liked the concept of a Steam machine, but it was very limited in terms of upgradability. The main issue I had was the restriction on 2.5" hard drive height, which ruled out a 3TB or greater HDD in the drive bay; it was manufactured with 9mm and smaller drives in mind. Also, the GPU was soldered onto the main board and non-upgradeable. I was looking for a longer-term gaming solution. I purchased an Alienware 15 R2 and a Graphics Amplifier in hopes of solving this, but learned that there is roughly a 10% performance decrease when running desktop graphics cards through the Graphics Amplifier, because the card runs at x4 speeds. This ultimately led to my decision to build a desktop PC that can leverage the full bandwidth of PCIe 3.0 at x16 speeds.

Specs

Corsair Air 240 Case (Micro ATX)

ASRock Z170M Extreme4 Micro ATX motherboard

Intel i5 6600K 3.50GHz

Corsair H75 AIO Cooler

16GB of DDR4 RAM (1 x 16GB; I plan to upgrade to 32GB this summer)

ZOTAC GTX 1050 Ti (I plan to upgrade to a GTX 1080 Ti this summer)

Samsung 850 EVO 500GB M.2 SSD

4TB HDD (Internal)

4TB External HDD

Plex

I have converted my entire Blu-ray and DVD collection to mkv/mp4 files that are available through Plex, so I can watch my library through the Plex app on my PS4 or Xbox One. Plex supports transcoding video and audio on the fly to the native formats supported by a given Plex client. I've also synced some of my library to my iPad and iPhone for when I travel.

Gaming

I am very pleased with this machine as a gaming rig. With Steam Big Picture Mode, I can have a console-like experience. It brings back the same fun I had with my ZOTAC NEN.

Graphics

I purchased a ZOTAC GTX 1050 Ti, as I was happy with the build quality and reliability of my previous Steam machine. The ZOTAC GTX 1050 Ti crushes gaming at 1080p on high/ultra and 1440p on medium/high for most games. Some of my older games can even play at 4K: DiRT 2 and DiRT 3, both several years old, run buttery smooth at 4K on the GTX 1050 Ti. I am impressed by this card's capabilities given that it is a sub-$150 card. I was primarily looking to get decent framerates at 1080p until I can replace it with a GTX 1080 Ti this summer. The bottom line: if you're looking for a budget low-power card, give the GTX 1050 Ti a look! You won't be disappointed!

Audio

I have a Sound Blaster Omni sound card with Dolby Digital Live encoding for 5.1 surround sound. The sound card encodes any surround sound signal into a Dolby Digital-compatible signal. In addition, I have an optical audio 2x4 matrix splitter that enables switching between different inputs and outputs in my entertainment center: one output goes to my speakers and the other to my wireless 5.1 surround sound headset. I frequently switch between my PC, TV, and original Xbox as inputs. My TV acts as a switch for my consoles, as both the Xbox One and PS4 pass surround sound over HDMI through to the TV's optical out.

Input Devices

I rarely use my keyboard and mouse while playing games in the living room; instead, I use a variety of controllers to play the different games in my collection. I have a Steam Controller and an Xbox 360 controller for most of my games, a GameCube adapter so I can play GameCube games through Dolphin, and a Wii Remote and Nunchuk for playing Wii games in Dolphin.

Future

Capture Card

I'm considering getting a high-end capture card for capturing gameplay from analog and digital sources. I've been reading up on different options, and the best option for doing both analog and digital appears to be the Micomsoft SC-512 PCIe card. One of its most attractive features is recording at 60fps from any source (HDMI, VGA, DVI, component, composite) at up to 1080p. From my research so far, it looks like HDMI audio is preserved from its original source, meaning that if the source is Dolby Digital 5.1, the resulting capture will have Dolby Digital 5.1 audio. This is quite cool, as most cards only support stereo or 2.1 audio. I'm not sure if YouTube has official support for 5.1 audio, but it would be neat to have surround sound in game recordings.

Blog now uses Let's Encrypt!

12. February 2017 20:34 by Cameron

I received a notification email from my SSL certificate authority that the certificate for www.gamerfootprint.com would expire soon. One of my co-workers told me about a free certificate authority called Let's Encrypt, and I thought I'd give it a try. The installation process is quite simple using a neat tool called Certify. After filling out a brief survey, you can download the application, register your email with Let's Encrypt, and begin creating certificates for IIS. It was as simple as selecting my IIS sites from a drop-down and requesting a certificate. The certificates are downloaded and automatically installed. They expire after 90 days, but you can easily request a new certificate after expiration. The Certify application is still in alpha, but they are actively working on adding new features. Hopefully we can have auto-renew as a feature some day soon. I'd pay for that feature for sure!

If you're running a web server, I highly recommend Let's Encrypt: it is free, and it's never been easier to install SSL certificates.

Note: If you have a domain and a CNAME, you'll need to make sure you select the proper SSL certificate in the site bindings. I actually just have https://www.cameronjtinker.com/ bound to port 443, with http://cameronjtinker.com and http://www.cameronjtinker.com redirecting to the secure site.

Happy New Year 2017!

14. January 2017 00:31 by Cameron in Hackintosh, Mac

Hey guys! It has been way too long since my last blog post. I will make more of an effort to write at least one blog entry per month this year. One of the latest tech things I've been working on is getting a new hackintosh laptop configured for iOS development.

I read recently on Reddit that the ThinkPad X220T is the perfect hackintosh laptop. After looking for a decent sub-$200 X220 device, I couldn't find any that were reasonably priced. I did, however, find a $115 Lenovo X230T with 8GB of RAM, a 2.6GHz Ivy Bridge i5, and a 320GB 7200RPM HDD! The laptop came with no OS, but that was no problem as I planned to install macOS Sierra. The install went through without a hitch. I have nearly 100% compatibility on this hackbook, minus the internal Wi-Fi and multi-touch on the touch screen. These are minor things I can ignore, as the rest of the laptop works quite well. I hope to write a more detailed post about this hackbook and my setup procedure soon.

Please stay tuned!

Gamer Footprint Update

6. July 2015 08:02 by Cameron in Gamer Footprint

Hey guys, I know it's been a few months since the last update. Life has been pretty busy! I've been working hard on updating Gamer Footprint to use an on-demand approach for storing and retrieving games, achievements, and trophies, so that we don't have to rely on a schedule for updating player information. I will probably still have a scanner that runs on a schedule, but an on-demand approach will allow players to view the games they or their friends have played even if the scanner hasn't reached their account yet. This is very much a work in progress, but I feel it is the best direction for near real-time updates: scanning all profiles linearly takes a long time on its own, while combining the scanner with on-demand content updates gives us much more frequent updates than before.

Currently, I have a rough implementation of PlayStation game listing for any PSN ID, but there is a bug that duplicates games that I'm still trying to work out. Because of this, I will wait to announce publicly how to access PSN games for a PSN ID. Over the next few weeks I hope to fix this bug so that everyone can enjoy viewing their PSN games on Gamer Footprint. The nice thing is that with these improvements, you will no longer need to provide your username or password for either Xbox Live or PlayStation Network; you will simply link your Gamertag and PSN ID to your account and our scanner will pick up your information. I've included a rough sketch of the on-demand flow below. Please stay tuned for more updates soon!
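
Here's that sketch. Every type and method name below is hypothetical, not actual Gamer Footprint code; it just illustrates serving stored data when the scanner has been by recently and fetching on demand otherwise:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical store/source abstractions, not the real Gamer Footprint types.
public class StoredGames
{
    public IList<string> Games { get; set; }
    public DateTime LastUpdated { get; set; }
}

public interface IGameStore
{
    Task<StoredGames> FindAsync(string psnId);
    Task SaveAsync(string psnId, IList<string> games);
}

public interface IGameSource
{
    Task<IList<string>> FetchAsync(string psnId);
}

public class OnDemandGameService
{
    private readonly IGameStore store;
    private readonly IGameSource network;

    public OnDemandGameService(IGameStore store, IGameSource network)
    {
        this.store = store;
        this.network = network;
    }

    public async Task<IList<string>> GetGamesAsync(string psnId)
    {
        // If the scheduled scanner has visited this profile recently,
        // serve its stored data.
        var cached = await store.FindAsync(psnId);
        if (cached != null && cached.LastUpdated > DateTime.UtcNow.AddHours(-1))
            return cached.Games;

        // Otherwise fetch from the network on demand and persist the result,
        // so the next viewer gets the stored copy.
        var games = await network.FetchAsync(psnId);
        await store.SaveAsync(psnId, games);
        return games;
    }
}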

Gamer Footprint April 2015 Update

25. April 2015 22:10 by Cameron in Durandal, Gamer Footprint

Hey guys! Gamer Footprint has been receiving some cool updates lately! I have nearly finished the incremental scanner for scanning achievements/trophies and games played! I have just a couple of issues to work out before the scanner is stable enough to run automatically without supervision. One of the main issues stems from request throttling on Sony's side, but I think I can work around that by limiting the number of requests per minute (see the sketch below). There are definitely tweaks that could improve the scanner, but the incremental portion has certainly made updating the timeline a ton faster.
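
A minimal sketch of that kind of throttle is below. The per-minute limit is a placeholder, since I don't know Sony's actual threshold; the idea is just to space requests out evenly:

using System;
using System.Threading;
using System.Threading.Tasks;

// Spaces calls evenly so we never exceed maxPerMinute requests per minute.
public class RequestThrottle
{
    private readonly TimeSpan interval;
    private readonly SemaphoreSlim gate = new SemaphoreSlim(1, 1);
    private DateTime lastRequest = DateTime.MinValue;

    public RequestThrottle(int maxPerMinute)
    {
        interval = TimeSpan.FromSeconds(60.0 / maxPerMinute);
    }

    // Await this before each call to the throttled endpoint.
    public async Task WaitAsync()
    {
        await gate.WaitAsync();
        try
        {
            TimeSpan wait = lastRequest + interval - DateTime.UtcNow;
            if (wait > TimeSpan.Zero)
                await Task.Delay(wait);
            lastRequest = DateTime.UtcNow;
        }
        finally
        {
            gate.Release();
        }
    }
}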

Now that I've extracted a ton of data, I hope to get into more of the frontend development soon. I've set up several skeleton pages that I'll be filling in with content shortly. I want to display the games that people play and the trophies/achievements available for each game. Content for a game page will be submitted automatically by the scanner, including publisher, developer, game description, release date, and several other available pieces of information. I've updated the site template to be more in line with flat UI design, and I'm working on simplifying the top menu so that it is easier to navigate and more mobile friendly. I'm not a huge fan of hamburger menus on mobile devices; I believe content should be easily accessible through navigation menus and links on the respective pages.

I've been paying close attention to Rob Eisenberg's new framework, Aurelia, and I'm seriously considering using it for future Gamer Footprint development. I love Durandal and have been a huge supporter of the project, but Aurelia provides so much more by leveraging ES6 and ES7 concepts in the framework. Aurelia is relatively new, having been released in February 2015, but I have faith that it will develop into a great product. I'll be experimenting with Aurelia over the coming months and preparing an upgrade path for Gamer Footprint. To the end user, the framework I use on the frontend/backend means nothing; for performance, however, Aurelia seems to stack up quite well against other frontend frameworks on the market.

Please stay tuned!

Gamer Footprint Update and E1337 Entertainment Merger

Hey guys! I know it's been about a month since the last update, but things have been pretty busy lately. I am in the process of merging Gamer Footprint with E1337 Entertainment in order to bring tournaments to the Gamer Footprint community and statistics tracking to the E1337 Entertainment community. We are very excited to be joining forces! Over the next couple of months, we will be supporting tournament brackets, an interactive calendar for events, and statistics tracking for tournaments as part of the merger.

I managed to get automated builds working with TeamCity for both the development and production sites for Gamer Footprint. This was a fun task to finish. I set up a build for each environment that pulls the latest source from Bitbucket and walks through the build steps: restoring NuGet packages, building the source, and minifying the JavaScript (on the production environment). Within a minute or two of pushing changes to Bitbucket, they are live on their respective environments.

Since the last update, I have implemented a few new features and fixed a couple of bugs:

I finished the first iteration of timelines on both the global scale and the personal scale. For now, the timelines are generated manually; I am still working on incremental updates that run automatically, which I expect to finish in the next couple of weeks. Both the global and personal timelines include games played on PlayStation Network and Xbox Live and trophies or achievements earned on each network. I'm playing with the idea of subscribing to or following another user so their events show up in your subscription feed. This way, you don't have timeline events from everyone on Gamer Footprint, which could grow quite large over time. As development progresses, I will add other events to the timeline, including games/achievements from Steam, games/achievements from Battle.net, and potentially statistics from other games such as Halo 4.

I've been experimenting with push notifications via SignalR to send near real-time updates of gamers' online presence on Xbox Live and PlayStation Network. This is still in active development and is not considered finished. I will look at adding presence information from other networks, such as Steam, in the near future. Through push notifications, we can also send messages to all actively connected users on the website or to specific users. Messages can include maintenance notifications and estimated downtime, notifications of when your friends come online on a specific network, achievement/trophy unlocks, and much more.

The goal for Gamer Footprint and E1337 Entertainment is to provide a place for gamers to meet, collaborate, and participate in tournaments from around the world. Without Gamer Footprint, each gaming community is separated by platform or console. We want to bridge that gap and promote organic connections with players from a plethora of gaming backgrounds.

Please stay informed on the latest updates for both Gamer Footprint and E1337 Entertainment. The name/branding from the merger is still being decided, but we will keep you updated with the news.

Gamer Footprint Update

25. February 2015 11:32 by Cameron in Gamer Footprint

It's been a little while since I've updated everyone about Gamer Footprint. Development is picking up, with me spending more time on it each week. I've been working primarily on the backend since the last update, but I have added some functionality to the frontend too.

Users can now register/login using their Google, Microsoft, Facebook, or Twitter account through OAuth2!

Xbox and PSN account linking is now functional, and you can see your Xbox Live and PSN presence if you have linked your accounts. Currently, the profile page polls the server every 20 seconds to see if there is a change in presence information. However, I'm looking at making this more like a push notification service from the server through SignalR. The trick will be setting up per-user presence push notifications, since the server will still poll the networks and should push to the client only when something changes. I attempted this a little while back, but I couldn't get the connection management quite right to cancel running tasks. I will attempt the push notifications at some point, but it's not a high priority at the moment. A rough sketch of what I have in mind is below.
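
This sketch assumes SignalR groups keyed by gamertag and a hypothetical presenceChanged client callback; it's an illustration of the poll-and-push-on-change idea, not the running Gamer Footprint code:

using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

// Hypothetical presence hub: clients subscribe to the gamertags they care about.
public class PresenceHub : Hub
{
    public Task Subscribe(string gamertag)
    {
        return Groups.Add(Context.ConnectionId, gamertag);
    }
}

// Fed by a server-side polling loop; pushes only when presence changes.
public class PresenceNotifier
{
    private readonly ConcurrentDictionary<string, string> lastKnown =
        new ConcurrentDictionary<string, string>();

    public void Report(string gamertag, string presence)
    {
        string previous;
        bool changed = !lastKnown.TryGetValue(gamertag, out previous)
                       || previous != presence;
        lastKnown[gamertag] = presence;

        if (!changed)
            return;

        // Only subscribers to this gamertag's group hear about the change.
        var hub = GlobalHost.ConnectionManager.GetHubContext<PresenceHub>();
        hub.Clients.Group(gamertag).presenceChanged(gamertag, presence);
    }
}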

I'm reworking my data persistence and scanning algorithms to update a user's played games on PSN, Xbox, and other platforms. Once that is complete, there will be a timeline feature that incorporates all actions a user wants displayed. Tracking of games played and achievements/trophies won't require a login, but due to privacy settings on each network, if you want to share your online status with your Gamer Footprint friends and the community, you'll need to link your accounts with Gamer Footprint. As of right now, there is no way to see the games someone is actively playing on PSN unless you are their friend, but this shouldn't be a big deal. Rest assured, your information is safe, and no passwords for Xbox or PSN accounts are ever stored on our server.

I will be posting more updates in the coming weeks so stay tuned!

Happy 2015 and Development Update

7. January 2015 10:52 by Cameron in Gamer Footprint

I know I'm a few days late, but Happy New Year 2015! It's crazy how time flies! Last year was a big one: I got married, changed jobs, and moved to North Atlanta! They say some of the largest life events are getting married, changing jobs, and moving. I did all three! My wife and I are finally beginning to settle in at our new jobs and in our new area.

Gamer Footprint development is slow but still active. I added an API for gathering the latest firmware versions for various consoles, including the 3DS, Wii U, Wii, PSP, PS Vita, PS4, PS3, Xbox, Xbox 360, and Xbox One.

A full list of supported consoles can be found here: http://dev.gamerfootprint.com/api/firmware/consoles

A sample response for 3DS: http://dev.gamerfootprint.com/api/firmware/3ds

[{"versionNumber":"9.4.0-21 U","region":"US"},
 {"versionNumber":"9.4.0-21 J","region":"JP"},
 {"versionNumber":"9.4.0-21 E","region":"EU"}]

I plan to add caching to the firmware versions API, as it can take a long time to pull back firmware versions for consoles such as the 3DS or Wii U that have a different firmware version source for each region. A rough sketch of the caching idea follows. I'm also working on a generic Neo4jClient repository for managing stored objects and relationships, which will help abstract away the need to interface with Neo4jClient directly. There is an existing Neo4jClient repository, but it hasn't been updated for Neo4j 2.0. I plan to open source my work once complete so that others may benefit.
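
The caching could be as simple as a MemoryCache wrapper around the per-console fetch. This is a minimal sketch with an assumed one-hour lifetime, not the final design:

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class FirmwareVersion
{
    public string VersionNumber { get; set; }
    public string Region { get; set; }
}

public static class FirmwareCache
{
    private static readonly MemoryCache cache = MemoryCache.Default;

    // Returns cached versions for a console, fetching (and caching) on a miss.
    public static IList<FirmwareVersion> GetOrFetch(
        string console, Func<IList<FirmwareVersion>> fetch)
    {
        string key = "firmware:" + console;
        var cached = cache.Get(key) as IList<FirmwareVersion>;
        if (cached != null)
            return cached;

        // Slow path: hit each region's firmware source, then cache the result.
        IList<FirmwareVersion> versions = fetch();
        cache.Set(key, versions, DateTimeOffset.UtcNow.AddHours(1));
        return versions;
    }
}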

I've learned a ton about generics programming while writing the generic repository. While some may argue that the generic repository is an anti-pattern, it has its uses when paired with a non-generic repository for gathering and storing objects related to a specific model. I am working on abstracting Neo4jClient calls so that I can pass LINQ expressions directly to the query engine without explicitly accessing it. I'll post some example code shortly to show my progress.
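
In the meantime, here's a sketch of the repository surface I'm aiming for. The names are mine and this is not Neo4jClient's actual API; it just shows the shape I want callers to see:

using System;
using System.Collections.Generic;
using System.Linq.Expressions;

// Callers pass LINQ expressions; the implementation would translate them
// into Cypher via Neo4jClient's query engine behind the scenes.
public interface IRepository<T> where T : class, new()
{
    T GetById(string id);
    IEnumerable<T> Where(Expression<Func<T, bool>> predicate);
    void Add(T entity);
    void Relate<TOther>(T from, TOther to, string relationship)
        where TOther : class, new();
}

// Hypothetical usage once an implementation exists:
//   IEnumerable<Gamer> gamers = repository.Where(g => g.PsnId == "SomePsnId");
//   repository.Relate(gamer, game, "PLAYED");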

I'm continuing to develop the Durandal/Web API/OAuth implementation of Gamer Footprint, and once I have the data layer established, I will begin writing more page functionality. The Gamer Footprint development site is still very unfinished, but in the next couple of months I expect to have basic account management and profiles finished. Please stay tuned!

OWIN Self-Hosted Test Server for Integration Testing of OData and Web API

A co-worker and I were recently given the task of performing integration testing on OData and Web API services. You can view his posts on the subject in his series: Part 1, Part 2, and Part 3. Traditionally, one might mock the web requests and responses, but by using the TestServer found in the Microsoft.Owin.Testing namespace, we can start an in-memory HTTP server for doing full integration tests. You can get the NuGet package here.

To start, create a new unit testing project with MSTest and a new ASP.NET MVC / Web API project. In the ASP.NET MVC / Web API project, install Web API 2.2 and Web API 2.2 for OData v4.0 and OData v1-3. For the unit test project, install Web API 2.2, Web API 2.2 for OData v4.0 and OData v1-3, Web API 2.2 OWIN Self Host, Web API 2.2 OWIN, and Microsoft.Owin.Testing. Below is a sample of a unit/integration test that sets up an OWIN test server for in-memory integration testing:

namespace SelfHosting.Test
{
    using Microsoft.Owin.Testing;
    using Microsoft.VisualStudio.TestTools.UnitTesting;
    using Owin;
    using SelfHosting.Test.Models;
    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Http;
    using System.Web.Http.Dispatcher;
    using System.Web.OData.Builder;
    using System.Web.OData.Extensions;
    using WebApp.Models;

    [TestClass]
    public class SelfHostingTest
    {
        protected TestServer server;

        [TestInitialize]
        public void Setup()
        {
            server = TestServer.Create(app =>
            {
                HttpConfiguration config = new HttpConfiguration();
                WebAppFacade.WebApiConfig.Register(config);
                app.UseWebApi(config);
            });
        }

        [TestCleanup]
        public void TearDown()
        {
            if (server != null)
                server.Dispose();
        }

        [TestMethod]
        public async Task TestODataMetaData()
        {
            // Request the OData service document, which lists the available
            // entity sets as { name, kind, url } entries.
            HttpResponseMessage response = await server.CreateRequest("/odata/").GetAsync();

            var result = await response.Content.ReadAsAsync<ODataMetaData>();

            Assert.IsTrue(result.value.Count > 0, "Unable to obtain meta data");
        }

        [TestMethod]
        public async Task TestWebApi()
        {
            HttpResponseMessage response = await server.CreateRequest("/api/values").GetAsync();

            var result = await response.Content.ReadAsStringAsync();

            Assert.AreEqual("\"Hello from foreign assembly!\"", result, "/api/values not configured correctly");
        }

        [TestMethod]
        public async Task TestODataPeople()
        {
            HttpResponseMessage response = await server.CreateRequest("/odata/People").GetAsync();

            var result = await response.Content.ReadAsAsync<ODataResponse<Person>>();

            Assert.AreEqual(3, result.value.Count, "Expected 3 people to return from /odata/People");
        }
    }

}

The OData service document is deserialized into a POCO (plain old C# object):

namespace SelfHosting.Test.Models
{
    using System.Collections.Generic;
    using Newtonsoft.Json;

    public class ODataMetaData
    {
        // The JSON key is "@odata.context", which is not a valid C# identifier,
        // so map it explicitly for Json.NET.
        [JsonProperty("@odata.context")]
        public string odatacontext { get; set; }
        public List<Value> value { get; set; }
    }

    public class Value
    {
        public string name { get; set; }
        public string kind { get; set; }
        public string url { get; set; }
    }
}

By using a generic ODataResponse class, we can deserialize our OData response into any POCO:

namespace SelfHosting.Test.Models
{
    using System.Collections.Generic;
    using Newtonsoft.Json;

    public class ODataResponse<T>
        where T : class, new()
    {
        [JsonProperty("@odata.context")]
        public string odatacontext { get; set; }
        public List<T> value { get; set; }
    }
}

The beauty of using the TestServer is that it is self-contained: the HTTP server is inaccessible outside of the process, and once the tests complete, the server is shut down. The WebApiConfig registered with the TestServer determines which controllers and routes to load for testing, and no production code needs to change in order to test existing Web API and OData controllers. The only problem I have found is that attribute routes don't seem to register correctly; perhaps I have not found the correct method of registering attribute routes with the TestServer.
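
If you hit the same attribute routing issue, one thing that may be worth trying (I haven't verified it against this sample solution) is forcing route initialization on the test configuration:

server = TestServer.Create(app =>
{
    HttpConfiguration config = new HttpConfiguration();

    // MapHttpAttributeRoutes() must run exactly once per configuration for
    // [Route]/[RoutePrefix] attributes to register; if your Register method
    // doesn't already call it, add it there.
    WebAppFacade.WebApiConfig.Register(config);

    // Force immediate route discovery instead of waiting for the first request.
    config.EnsureInitialized();

    app.UseWebApi(config);
});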

Here is the Visual Studio 2013 solution with both a web project and a unit testing project:

SelfHostingUnitTest.zip (1.39 mb)
