Plex Setup Update

3. June 2019 22:33 by Cameron

I now have all of my media in one place on my HP Proliant DL380 G7. It took a few days as the library consists of over 14TB of files.

When I tried to install the Quadro P400 in the server, the system would crash during driver installation and force a reboot, leaving Windows effectively bricked. I fought with this 3-4 times before giving up on getting it to work. Plan B is to use a single-slot GTX 1050 in its place. I have a GTX 750 installed in my colo server, so this should work too. The GTX 1050 has more shader units than the Quadro P400, and it is HP branded, so hopefully the fan control will be gentler. Its memory bus is also 128-bit versus the P400's 64-bit. I will need to use the NVENC driver unlock patch to get its full potential, but according to the attached matrix, I should be able to get 14 simultaneous transcodes. I haven't benchmarked this myself yet, but the numbers look promising. I will post an update once I've had a chance to test these findings.

nVidia NVENC NVDEC Matrix.pdf (427.36 kb)

New Plex Setup

30. May 2019 11:17 by Cameron

Last year, I bought a Synology DS918+ and 4x4TB Seagate IronWolf NAS drives. My plan was to use this NAS for Plex and upgrade the storage as needed. However, I am quickly running out of space (2TB free of 12TB), and the time it takes to re-calculate parity when swapping drives is enormous. One option with this NAS is to add an external expansion unit for an additional 5 drives, but that enclosure is $500 and too expensive for a proprietary solution. Another issue is that the CPU is limited when streaming to multiple devices or transcoding for a mobile device's Plex library. In comes my new plan: build a new server using old parts (HP enterprise hardware) that can act as both a NAS and a Plex server.

A few years ago, I purchased an HPE Proliant DL380 G6 from a local computer recycling company. This server is a capable machine with two Xeon X5650 CPUs, 72GB of RAM, and a variety of HDDs/SSDs. I bought it to act as a web/application server, and I have it deployed in a local colocation center so I can access it from anywhere in the world. Now, in 2019, I have returned to the same idea, but for a more dedicated purpose: NAS and Plex. This server will be hosted at my house so I can get the best speeds and not worry if the Internet is offline.

The new server I bought is also an HPE Proliant DL380, but it is a G7. I bought it from the same computer recycling company as before. My initial setup includes two Xeon X5660 CPUs, 48GB of RAM, 1x240GB SSD, 5x5TB Seagate Barracuda HDDs, and an Nvidia Quadro P400. With this setup, I plan to install Windows Server 2019 Datacenter on the SSD and create a RAID 5 array out of the 5TB drives, leaving me 20TB of usable space on the array. The Quadro P400 is mainly for entry-level H.265 transcoding so I don't need to maintain both 4K and 1080p copies of my video library. I may look at getting a Quadro P2000 or better down the line if the need arises. This server configuration has two drive cages capable of holding eight 2.5" drives each, which leaves room for two more 5x5TB RAID 5 arrays down the line.

For my Synology, I will move all of the video-related items to my HP Proliant DL380 G7 and use the Synology as a personal file sharing NAS for family and friends.

My venture into CRT gaming

13. May 2019 12:44 by Cameron in gaming, SNES

Let me preface this by saying that getting into CRT gaming is not a cheap endeavor. If you go all RGB, you either need to mod the systems that don't support RGB or buy them already modded. Additionally, monitors are in higher demand and people aren't letting go of them as easily, so expect higher prices in the coming years.

I recently started following the CRT gaming subreddit and, subsequently, I've begun acquiring various CRT displays. My first CRT was a Sony PVM-14M2U (14" RGB monitor), which I held onto for a few months before getting my Olympus OEV203, an ex-medical 20" RGB monitor. Shortly after getting my original PVM, I wanted a bigger screen and ultimately found a 27" Sony Trinitron consumer TV with component, S-Video, and composite inputs. The next monitor I found is an Amdek 310A, a 12" amber monochrome monitor that works with the MDA/Hercules standard from vintage IBM PCs. Lastly, I picked up a 15" Gateway 2000 Vivitron for retro 90's Windows gaming.

You may be wondering: you have all these monitors, but what about systems to connect to them? Well, I have you covered there! I invested in a 6-way SCART/component switch to handle multiple consoles feeding my PVM and 27" Trinitron. I have the following systems connected via this switch:

  • Super Nintendo
  • Nintendo N64
  • PS1
  • Sega Genesis
  • Sega Saturn
  • Sega Dreamcast
Note: I have an NES that I plan to RGB mod, but I haven't had the time to do this yet.
 
I have all of my RGB systems routed through a sync strike in order to record/stream from my StarTech USB3HDCAP capture card. For those unfamiliar with a sync strike, it takes any sync signal and produces a CSYNC signal for either a monitor or a capture device. Some of the systems in my setup are sync-on-luma (PS1 and N64, I think), so this was necessary to allow captures. On the other end of the sync strike, I have a VGA distribution amplifier which I use to split the signal between my PVM (via a VGA to 4 BNC breakout cable) and my capture card.
 
I also have a Wii, GameCube, Original Xbox, and PS2, but these are not connected via RGB. All but the GameCube are connected via component; the official GameCube component cables are too expensive, so I went with S-Video for it.
 
For my retro computing, I have an old Compaq Deskpro 4000 with the following specs:
  • Pentium 233MMX
  • 256MB of RAM
  • 32GB of storage via an SD to IDE adapter (I left the original 2GB HDD in there since swapping drives in Compaqs is a pain)
  • 48X IDE CD Drive
  • S3 Virge 2MB PCI graphics card
  • Creative SB Live SB0060
  • Hercules compatible ISA graphics card for monochrome displays
  • 2GB internal Jaz drive with PCI SCSI adapter
  • Windows 98 SE (DOS 7.1)
I have a few classic games installed at the moment, such as RollerCoaster Tycoon Deluxe, Age of Empires 2, and SimCity 2000.
 
I don't know how much longer these tubes will last given the high hours on some of them. When seeking out equipment, I recommend going for a Sony Trinitron or equivalent. My Gateway Vivitron SVGA monitor, for example, is a rebranded Sony Trinitron, and Dell had some monitors that were rebranded Trinitrons as well. The Olympus is a rebranded Sony PVM, so that's also a good alternative. For consumer sets, if you can find a Trinitron, go for it; the aperture grille versus shadow mask makes a big difference in image quality in my opinion. You will be fine with many other sets out there too. Just be sure to check whether the set has component or S-Video inputs, as either will be far better than composite video. That said, I am impressed with Sony's comb filter on composite video with my consumer set. Flat Trinitrons tend to have geometry issues because CRTs weren't designed for flat screens. Your mileage may vary, but the things to remember are to test the screen before buying and to buy local when possible. Lastly, if the seller is asking a ton of money for a screen, try to talk them down. Many Goodwills still have CRTs, and you can check Craigslist, eBay, and the CRT gaming subreddit for local listings.

Ignoring TypeScript generated files from Git

22. April 2019 16:41 by Cameron in Git, javascript, TypeScript

I recently started a project with TypeScript and found that the .js and .js.map files were being added to my repository. These files are automatically generated with each build of your TypeScript code, so they're not needed in the repository. To remove them if you've already added them, run the following:

git rm src/**/**/*.js
git rm src/**/**/*.js.map
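
If you want to keep the generated files on disk and only remove them from Git's index, git rm also accepts a --cached flag:

git rm --cached src/**/**/*.js
git rm --cached src/**/**/*.js.map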

Following that, you can add these lines to your .gitignore to ensure they're not added back:

src/**/**/*.js
src/**/**/*.js.map

Configure Azure DevOps Pipelines for building WiX installers

5. April 2019 14:58 by Cameron in Continuous Integration, Programming

Although the WiX installer toolset is installed on the default build agent in Azure DevOps, you need to tweak the VSBuild task a bit to make it work. Before I updated the VSBuild task, my builds were hanging and never finishing, wasting precious Azure DevOps Pipelines minutes. To remedy this, you need to edit your azure-pipelines.yml like so:

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
    msbuildArgs: '/p:RunWixToolsOutOfProc=true'

The important piece here is the msbuildArgs option on the VSBuild task. Setting RunWixToolsOutOfProc makes the WiX tools run in their own process rather than inside MSBuild, which is what the build agent needs. After your installer builds, you can publish it as part of the build artifacts, so you always have an up-to-date installer.
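
For reference, publishing the MSI as an artifact usually looks something like the following. The CopyFiles pattern is only an example; adjust it to wherever your WiX project actually writes its output:

- task: CopyFiles@2
  inputs:
    contents: '**/bin/$(buildConfiguration)/**/*.msi'
    targetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'installer'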

Migrate Large Bazaar Repositories to Git

20. March 2019 16:18 by Cameron in Source Control

I was recently tasked with migrating my company's Bazaar repository to Git, but using Bazaar and Git for Windows wasn't working quite right. Every time I tried the migration, Bazaar would run out of memory. After further research, I found that this was because Bazaar on Windows attempts to load all files into memory, which exceeded the 2GB memory limit of a 32-bit process. It was important to maintain commit history, since we will eventually be retiring Bazaar and don't want to have to go back to it to look up old revisions.

In comes Windows Subsystem for Linux (WSL):

With Windows Subsystem for Linux, I was able to get Ubuntu 18.04 LTS installed and set up Bazaar and Git. When running Bazaar from Ubuntu 18.04 LTS in WSL, the migration completed successfully; the Bazaar process had a 64-bit address space available and didn't fail from being out of memory. Setting up WSL took about 10 minutes and the migration itself took about another 10, so within 20 minutes I was ready to push to our Git server.
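
For anyone attempting the same migration, one common way to do it is with the bzr fastimport plugin. A rough sketch, where the repository paths and remote URL are placeholders for your own:

# install Bazaar, Git, and the fastimport plugin (package name may vary by release)
sudo apt install bzr bzr-fastimport git

# export the Bazaar history and replay it into a fresh Git repository
git init project.git
cd project.git
bzr fast-export --plain ../project.bzr | git fast-import

# push the imported history to your Git server
git remote add origin <your Git server URL>
git push origin --all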

Retro Video Game Homebrew

3. January 2019 07:24 by Cameron

Over the past few months, I've been watching various videos about the cool homebrew and modding potential of older sixth- and seventh-generation consoles. Most of my inspiration comes from ModernVintageGamer, but I have been following a few others as well (Adam Koralik and Dreamcastic for the Sega Dreamcast). Since my last post, I've been collecting these consoles, including the original Xbox, Xbox 360, PS2, PS3, and Sega Dreamcast. My end goal is to have a working development environment for each system and try to port some homebrew to each platform. I would like to make a simple 2D game engine that can run on all of these systems. I have a lot of work ahead of me, but it will be a good learning experience to get familiar with each system's architecture. From what I've read, the hardest of these consoles to develop for will probably be the PS2, but I will have to wait and see. Fortunately, all of these systems are old enough that there are mature homebrew communities and SDKs available.

Getting HD-DVD movies into Plex

28. September 2018 09:31 by Cameron in Home Theater

In order to get HD-DVD movies into Plex you need the following:
1. An external/internal HD-DVD drive (an Xbox 360 HD-DVD drive is cheap on eBay, ~$10-15)

2. MakeMKV 

3. AnyDVD HD/DVDFab Passkey (used for decrypting HD-DVD discs)

4. Clown BD (used for converting decrypted HD-DVD movies to M2TS)

5. Plenty of hard drive space, as movies average 10-20GB apiece. I use a Synology DS918+ NAS with 12TB of space.

Method 1:

The easiest approach to ripping HD-DVDs is using MakeMKV and selecting the main feature. The process is similar to ripping a standard DVD or Blu-Ray/UHD disc: insert the disc in the drive, wait for MakeMKV to recognize it, then load the title list and select the main feature (the largest title in the list). In terms of audio, most discs have Dolby Digital Plus; some have TrueHD, which is lossless and preferred when available.
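
If you'd rather script your rips, MakeMKV also ships a command line front end called makemkvcon. A rough sketch, where the disc index and output folder are placeholders ("all" can be swapped for a specific title number if you only want the main feature):

makemkvcon mkv disc:0 all "D:\Rips\MovieTitle"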

Note 1: Some people have mentioned having better success with an older version of MakeMKV (1.9.9) than with more recent versions. You can try this if you get read errors on some of your discs.

Note 2: The MKVs produced by MakeMKV do not have the telecine pulldown flag removed, so you are left with 29.97 fps video. This may not be an issue, but if you want to remove the flag, you will need to demux the MKV using eac3to and then remux it (replacing the original video) using something like MKVToolNix.
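
As a rough sketch of that demux/remux, assuming the video is VC-1 on eac3to track 2 (check eac3to's track listing for the real track numbers and codec, and keep the original file until you've verified the result):

eac3to movie.mkv 2: video.vc1
mkvmerge -o movie-fixed.mkv --no-video movie.mkv --default-duration 0:24000/1001fps video.vc1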

Method 2:

If MakeMKV can't read the disc, then you may need to try with AnyDVD HD/DVDFab Passkey and rip the entire video disc to a temporary location on your hard drive. Then you can use Clown BD to demux/remux into an M2TS.

Method 3:

Use AnyDVD HD/DVDFab Passkey to take a full ISO backup of your disc. Then mount the ISO and attempt Method 1 or Method 2 (pointing Clown BD at your virtual drive instead).

If these methods are unsuccessful, you might have disc rot and be unable to fully read the disc. In this case, you can evaluate if you want to replace the damaged discs with their Blu-Ray release.

General Notes:

Audio in Plex is a bit finicky depending on your client. If you use the PC client, nearly any format is supported via direct play. However, clients such as the Xbox One or PS4 might need Dolby Digital Plus transcoded to AAC before they can play it. I found that on my Synology DS918+ this can be troublesome, because transcoding DD+ to AAC requires a lot of processing power and the movie sometimes buffers while the audio transcodes.

The bottom line for movie enthusiasts:

Be smart about adding HD-DVDs to your library. Warner Brothers titles are riskier and a bit of a mixed bag due to disc rot. I have had success with a few titles so far, but I'm not holding my breath on all of my Warner Brothers titles. Also consider that since HD-DVD has been out of commission for the last 10 years, movies on the format will only be as new as 2008. However, there are dozens of titles from that era that are worth collecting.

Why I bought an HD-DVD player in 2018

28. September 2018 09:15 by Cameron in Home Theater, server

I recently stumbled upon a video on YouTube about adding various upgrades/addons to an Xbox 360, among them the Xbox 360 HD-DVD drive. Since HD-DVD lost the format war against Blu-Ray back in 2008, it is considered a dead format, and movies on the platform are likely cheaper than their Blu-Ray counterparts. That's when I had the thought to purchase this drive and some HD-DVDs to expand my HD movie collection. Because the drive communicates over USB, it can be used with a PC, and you can either play the discs directly or rip them to play on your media server. I went to eBay and found a 46-movie lot with two HD-DVD drives for $60, which is a huge discount on movies.

I am now in the process of converting these movies with MakeMKV and AnyDVD HD/Clown BD. Most of the discs I've tried have worked with MakeMKV, but a few might need a different approach. I've been able to rip most of my library so far without issue, but some of the Warner Brothers titles are more difficult to rip due to disc rot. Luckily, I was able to rip "A Clockwork Orange", "I Am Legend", and "Constantine" so far, and there doesn't appear to be any rot on these. I have about 10-15 more Warner Brothers titles to assess, but hopefully I can get through most of them. Disc rot is inevitable on Warner Brothers HD-DVDs, so if you still have any of these, get them backed up ASAP!

I will be looking for single movies or smaller lots that don't contain Warner Brothers movies in the future. However, even if only 20-30 of the 46 movies in this lot are good, it is still a heck of a deal.

HTPC Upgrade/Rebuild!

28. July 2018 22:55 by Cameron in Home Theater

I've upgraded and nearly completely rebuilt my HTPC for better long-term usability and some moderate gaming. I swapped out the motherboard for an HP Elite 8300 motherboard, upgraded the RAM to 16GB, upgraded the CPU to an i7 3770, upgraded the SSD to 240GB, upgraded the hard drive to 4TB, and upgraded the GPU to a GTX 1050 Ti.

Originally, the upgrades were inspired by high CPU usage when playing H.265 content from Plex. I noticed this when playing back 4K H.265 content on my i5 2500. With the i7 3770, I am able to play back 4K H.265 movies at roughly 30-40% CPU usage. Adding the GTX 1050 Ti also helped, since its hardware decoder can offload H.265; the GT 1030 simply didn't have the power to help with H.265, so everything was CPU bound. I needed to swap motherboards because the HP 6200 Pro doesn't supply the 75W the GTX 1050 Ti needs. Since I was swapping motherboards, I went for the HP Elite 8300 so I could get Ivy Bridge CPU support, and naturally I needed the best i7 available. The larger SSD was so I could install a couple of more demanding games such as Rise of the Tomb Raider and Just Cause 3. I outfitted the rig with a 4TB drive to accommodate more Steam games and to support recording TV shows/movies with Plex DVR.

As a result of these upgrades, the only thing original to the PC I first bought is the chassis. Now, I am happily using this machine as a moderate gaming machine and my home theater workhorse. I will likely upgrade the low-profile GPU as needed in the future, but the GTX 1050 Ti should suit me for a while.
