
Universal Game Translator – Using Google’s Cloud Vision API to live-translate Japanese games played on original consoles (try it yourself!)

Why I wanted a “translate anything on the screen” button

I’m a retro gaming nut.  I love consuming books, blogs, and podcasts about gaming history.  The cherry on top is being able to experience the identical game, bit for bit, on original hardware.  It’s like time traveling to the 80s.

Living in Japan means it’s quite hard to get my hands on certain things (good luck finding a local Speccy or Apple IIe for sale) but easy and cheap to score retro Japanese games.

Yahoo Auction is kind of the eBay of Japan.  There are great deals around if you know how to search for ’em.  I get a kick out of going through random old games; I have boxes and boxes of them.  It’s a horrible hobby for someone living in a tiny apartment.

Example haul – I got everything in this picture for $25 US! Well, plus another $11 for shipping.

There is one obvious problem, however

It’s all in Japanese.  Despite living here for over fifteen years, my Japanese reading skills are not great. (don’t judge me!) I messed around with using Google Translate on my phone to help out, but that’s annoying and slow to use for games.

Why isn’t there a Google Translate for the PC?!

I tried a couple of utilities out there that might have worked for at least emulator content on the desktop, but they all had problems: font issues, weak OCR, and nothing built to work on an agnostic HDMI signal so I could do live translation while playing on real game consoles.

So I wrote something to do the job called UGT (Universal Game Translator) – you can download it near the bottom of this post if you want to try it.

Here’s what it does:

  • Snaps a picture from the HDMI signal and sends it to Google to be analyzed for text in any language
  • Studies the layout and decides which text is dialog and which bits should be translated “line by line”
  • Overlays the frozen frame and translations over the gameplay HDMI signal
  • Allows copy/pasting the original language or looking up a kanji by clicking on it
  • Can translate any language to any language without needing any local data, as Google is doing all the work; can handle rendering Japanese, Chinese, Korean, etc. (The font I used is this one)
  • Controlled by hotkeys (desktop mode) or a control pad (capture mode, this is where I’m playing on a real console but have a second PC controller to control the translation stuff)

In the video above, you’ll notice some translated text is white and some is green.  Green text means it’s being treated as “dialog” by the weighting system that decides what is and isn’t dialog.

If a section isn’t determined to be dialog, “line by line” mode is used.  For example, options on a menu shouldn’t be translated all together (“Run Attack Use Item”) but as little pieces separately (“Run”, “Attack”, “Use Item”), each overlaid exactly over its original position.  If translated as dialog, it would look and read very badly.
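The gist of the weighting is something like this – a simplified sketch, not the actual UGT code, and the thresholds here are made up for illustration:

```cpp
#include <vector>

// A detected text block from the OCR results (positions in pixels).
struct TextBlock {
    int x, y, width, height;
    int lineCount; // how many text lines OCR grouped into this block
};

// Hypothetical weighting: big multi-line blocks read like dialog,
// small isolated blocks read like menu items ("line by line").
bool IsLikelyDialog(const TextBlock& b, int screenWidth)
{
    int score = 0;
    if (b.lineCount >= 2) score += 2;           // dialog usually wraps
    if (b.width > screenWidth / 2) score += 2;  // dialog boxes are wide
    if (b.height > 40) score += 1;              // taller than one menu row
    return score >= 3;
}
```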

Here’s how my physical cables/boxes are set up for “camera mode”. (Not required – desktop mode doesn’t need any of this, but I’ll talk about that later)

Happy with how merging two video signals worked with a Roland V-02HD on the PlayStep project, I used a similar method here.  I’m doing luma keying instead of chroma keying, as I can’t really avoid green here.  I modify the captured image slightly so its luma is high enough to not be keyed transparent in the overlay. (of course the non-modified version is sent to Google)
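The luma lift is conceptually just this – a simplified sketch, where the threshold value is illustrative, not the V-02HD’s actual key setting:

```cpp
#include <algorithm>
#include <cstdint>

// Rec. 601 luma from 8-bit RGB.
inline int Luma(uint8_t r, uint8_t g, uint8_t b)
{
    return (299 * r + 587 * g + 114 * b) / 1000;
}

// Lift near-black pixels above the luma key threshold so the frozen
// frame isn't punched out by the switcher's luma key.
void LiftAboveKey(uint8_t* rgb, int pixelCount, int keyThreshold = 24)
{
    for (int i = 0; i < pixelCount; ++i) {
        uint8_t* p = rgb + i * 3;
        if (Luma(p[0], p[1], p[2]) <= keyThreshold) {
            // raise each channel just enough to clear the key
            for (int c = 0; c < 3; ++c)
                p[c] = (uint8_t)std::max<int>(p[c], keyThreshold + 8);
        }
    }
}
```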

This setup uses the Windows camera interface to pull HDMI video (using Escapi by Jari Komppa) to create the screenshots it sends to Google.  I’m using an Elgato Cam Link for the HDMI input.

Anyway, for 99.99999999% of people this setup is overkill, as they’re probably just using an emulator on the same computer, so I threw in a “desktop mode” that lets you use hotkeys (default is Ctrl-F12) to translate the active window.  It’s just like having Google Translate on your PC.

Here’s desktop mode in action, translating a JRPG being played on a PC Engine/TurboGrafx-16 via emulation.  It also shows how you can copy/paste the recognized text if you want, useful for kanji study or getting text read to you.  You can click a kanji in the game to look it up as well.

Try it yourself

Before you download:

  • All machine translation is HORRIBLE – this in no way replaces the work of real translators, it’s just (slightly) better than nothing and can stop you from choosing “erase all data” instead of “continue game” or whatever
  • You need to rename config_template.txt to config.txt and edit it
  • Specifically, you need to enter your Google Vision API key.  This is a hassle but it’s how Google stops people from abusing their service
  • Also, you’ll need to enable the Translation API
  • Google charges money for using their services after you hit a certain limit. I’ve never actually had to pay anything, but be careful.
  • This is not polished software and should be considered experimental, meant for computer-savvy users
  • Privacy warning: Every time you translate, you’re sending the image to Google to analyze.  This also could mean a lot of bandwidth is used, depending on how many times you hit the translate button.  Ctrl-F12 sends the active window only, Ctrl-F11 translates your entire desktop.
  • I got bad results with older consoles (NES, Sega Master System, SNES, Genesis), especially games that use only hiragana and no kanji.  PC Engine, Saturn, Dreamcast, Neo-Geo, PlayStation, etc. worked better, as they usually have sharper fonts with full kanji.
  • Some game fonts work better than others
  • The config.txt has a lot of options, each one is documented inside that file
  • I’m hopeful that the OCR and translations will improve on Google’s end over time; the nice thing about this setup is the app doesn’t need to be updated to take advantage of those improvements, or even additional languages that are supported later

While a translation is displayed, you can hit ? to show additional options

5/8/2019 – V0.50 Beta – first public release, experimental
5/13/2019 – V0.51 Beta – Added S to screenshot, better error checking/reporting if translation API isn’t enabled for the Google API key, minor changes that should offer improved translations

Download Universal Game Translator for Windows (64-bit) (Binary code signed by Robinson Technologies)

Conclusion and the future

Some possible upgrades:

  • Built-in text-to-speech on the original dialog (well, by built-in I mean using Google’s text-to-speech API and playing it in UGT, easier than the copy-and-paste method possible now)
  • A built-in kanji lookup also might be nice; Jim Breen’s dictionary data could work for this.
  • My first tests used Tesseract to do the OCR locally, but without additional dataset training it didn’t work so hot out of the box compared to results from Google’s Cloud Vision.  (Do they use a modified Tesseract?  Not sure)  It might be a nice option for those who want to cut down on bandwidth usage or reliance on Google.  Although the translations themselves would still be an issue…

I like the idea of old untranslated games being playable in any language, in fact, I went looking for famous non-Japanese games that have never had an English translation and really had a hard time finding any, especially on console.  If anyone knows of any I could test with, please let me know.

Also, even though my needs focus on Japanese->English, keep in mind this also works to translate English (or 36 other languages that Google supports OCR with) to over 100 target languages.

Test showing English being translated to many other languages in an awesome game called Growtopia

Spawning annoying black holes in Fortnite to force a kid to exercise

Sure, there are ways to get exercise while gaming. Virtual reality and music games like Dance Dance Revolution come to mind.

But that’s all worthless when your kid just wants to play Fortnite.

Behold, the PlayStep!

This thing forces him to work up a sweat. This post details what methods I used and issues I had making it.  (Github source code for the program that runs on the Pi here for anybody who wants to make one)

Building a screen blanker connected to exercise isn’t a new idea (see the end of this post for related links I found) but my version does have some novel features:

  • Dynamically modifies the video and audio of the game’s HDMI signal to do things like partially obscure the screen in random ways
  • Uses an energy bank so you can save up game time.  This means you can madly pedal in the lobby and still sit in a chair during the critical parts of Fortnite

I first built a cheap version (~$120 in parts).  It just blanks the screen when you’re out of energy, and uses an LCD screen to show energy left.

I then built a more expensive version (~$700 in parts), but it’s a lot cooler.

The expensive version with HDMI in/out, the “enclosure” is a plastic basket thing from the dollar store

Things both ways have in common:

  • Use a Raspberry Pi 3B+ (a $40 computer with hardware GLES acceleration) with the RetroPie distro – I start with it because its mouse/keyboard/GLES/SDL works out of the box with Proton SDK, where normal Raspbian requires tweaking/compiling some things
  • Use Proton SDK for the app base (allows me to design/test on Windows; it handles abstraction for many platforms so I can write once but run everywhere)
  • Use hall effect sensors to detect the pedal-down position on each pedal via the Pi’s GPIO; this way a kid can’t cheat, he’s forced to move through the full range of the stepper
  • The sensors are placed on a stepper exerciser.  I used a USB connector for the wiring so I could unplug/replace it later if I wanted to set up a different exercise machine, like if I ever got a stationary bike.

Yes, I’m about to duct tape an electrical-taped sensor to a pencil that has been zip-tied in place. What? I never said I was a pro

A note on using USB cables for wires and my idiocy

Each hall effect sensor requires three wires.  We have two sensors.  So we need to run six wires from the Pi GPIO pins?  WRONG! We only need four because the power and ground can be shared between them.

So I thought hey, I’ll use USB cables and connectors laying around as they have four wires in them. (until we get to USB 3+ cables, but ignore that)

Then I thought, if I could find a simple USB Y splitter, it would be easier to share the power/ground between the two sensors. (I’m not actually using this as a USB connection, it’s just so I can use the wire and handy plugs)

Wow, I found this for cheap on Amazon:

Perfect!  A lowly USB splitter that I’m sure just has no fancy electronics hidden inside

So I partially wired it up but when testing found that the middle pins had no continuity.  Can you guess why?

WHAT THE HELL IS THIS INSIDE THE CABLE?!

It’s got a hub or something hidden in the connector.  I never plugged it into an actual PC or I might have noticed.  No wonder it didn’t work.  I removed the electronics part (it was a horror, I shouldn’t be allowed near soldering irons) and it worked as expected. Moral of the story is, I’m dumb, and don’t trust USB splitters to just split the wires.

The cheap way (just screen blanking with LCD panel)

My “cheap” way ignores rendering anything graphical (it doesn’t output any HDMI itself) and just shows a single “energy count” number on an LCD screen.  When the count runs out, the game’s HDMI signal is completely shut off until it goes positive again.  In the video above I’m using little buttons to test with instead of the stepper.

To help the user notice the screen is about to shut off it makes a beeping noise as the counter nears zero.

I suggest never testing this at an airport, can’t stress that enough really.

So how can a Raspberry Pi turn on/off the game’s HDMI signal?

A splitter with no USB power = a dead signal

This is hacky but it works – I took an old 1X2 HDMI splitter and powered it from one of the Pi’s USB ports.  (lots of electronics these days use a USB plug for power)

I only use one of the outputs on the splitter as I don’t really need any splitting done.

It’s possible to kill the power on a specific Pi USB port using a utility called uhubctl.

So when the player is out of “energy”, I kill the USB port powering the HDMI splitter by having my C++ code run a system command of:

./uhubctl -a off -p 2

And because the HDMI splitter is now unpowered, the signal dies killing the game screen.

After turning the USB port back on (replacing “off” with “on”) it will power up and start processing the HDMI signal again.  Originally I was using the Pi to turn on/off an entire AC outlet but that seemed like overkill – I was thinking maybe turning off an entire TV or something, but meh.
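The whole power toggle boils down to building that command string and handing it to system() – a minimal sketch (the port number matches my hub; yours may differ):

```cpp
#include <cstdlib>
#include <string>

// Build the uhubctl command that powers the HDMI splitter's USB port
// on or off.  Port 2 is what I use; adjust for your hub layout.
std::string UhubctlCommand(bool on, int port = 2)
{
    return std::string("./uhubctl -a ") + (on ? "on" : "off") +
           " -p " + std::to_string(port);
}

// In the real program this is just handed to system():
void SetSplitterPower(bool on)
{
    std::system(UhubctlCommand(on).c_str());
}
```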

So the big downside of this method is it takes up to 5 seconds for the HDMI splitter to turn back on and for your TV to recognize the signal again.  It works but… it’s not optimal.  Also, in my case I don’t really have a good place to put the LCD screen or the speaker for the beeping. (might make more sense on a stationary bike instead of a stepper)

Alternate way to disable the HDMI signal: Instead of this no-wiring hack, maybe run the signal through an HDMI cable with one of the pins wired through a relay to turn that pin on/off?  Might have the same effect but cheaper and simpler… although, which pin?!

The expensive but better way (offers more options with images and audio)

There isn’t enough drama in simply turning the HDMI signal on/off – wouldn’t it be better if holes started spawning randomly over your actual gameplay and you had to pedal to remove them as your screen became increasingly obscured?!  There are a million options, really.

The Raspberry Pi can generate the graphics (thanks, GLES) and audio, but we need a way to overlay its HDMI output over the game’s HDMI signal with no noticeable latency cost at 60 fps.

This is known as a chroma key effect.  (Side note: I once bought a $5,000 video mixer in the 90s so I could do live-effects like this, a WJ-MX 50.  Just saw one on ebay for $100, damn it’s big)

The V-02HD. A lot cheaper than $5,000.

The cheapest stand-alone way I found to do it these days is a Roland V-02HD video switcher. (I bought it for $664 USD from Amazon Japan)

Does anybody know a better/cheaper alternative? If I could figure out a no-latency way to overlay with an alpha channel instead of just chroma, that would really be ideal.

It’s pricey, but it works perfectly.  It has the following features of interest:

  • Remembers all settings when powered on, including chroma key mode and color/sensitivity
  • Can disable auto-detection so inputs 1 and 2 are always assigned the same way, even if an input is turned off
  • Can disable all buttons/levers on it so accidental changes won’t happen (we don’t need them active, it’s just a black box to us)
  • It’s pretty small for a video switcher
  • Mixes audio into the HDMI signal from both inputs
  • No noticeable latency

Although I didn’t need or use it, it’s worth noting that it can show up as a USB MIDI device and be controlled via MIDI signals.  That’s pretty cool – assuming the Pi could work with it, you could do transitions between inputs or enable/disable effects.

The Software

With no color keying, this is what the raw Pi video out looks like

The software to control things uses Proton SDK with its SDL2 backend, plus WiringPi to read the sensors over GPIO.  It’s modified from the RTBareBones example.
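The anti-cheat part of the pedal logic can be sketched like this – simplified from what the real program does, with illustrative names (on the hardware, the sensor events would come from WiringPi’s digitalRead() on the GPIO pins):

```cpp
// Energy is only awarded when the two hall sensors fire alternately,
// i.e. the kid moves the stepper through its full range of motion.
struct PedalTracker {
    int lastSide = -1; // -1 = no pedal seen yet, 0 = left, 1 = right
    int fullSteps = 0;

    void OnSensor(int side) // called when a pedal-down sensor triggers
    {
        if (side != lastSide) {           // must alternate to count
            if (lastSide != -1) fullSteps++;
            lastSide = side;
        }
    }
};
```

Holding one pedal down and wiggling it does nothing; only a full left-right cycle increments the count.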

It uses a config.txt file to adjust a couple things:

max_energy|600
energy_timer|1000
energy_per_move|7
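Those three values drive a simple energy bank – roughly like this (a simplified sketch, not the actual PlayStep source):

```cpp
#include <algorithm>

// Each full pedal motion deposits energy_per_move, a periodic timer
// (every energy_timer ms) withdraws 1, and the bank is capped at
// max_energy so you can save up game time during lobbies.
struct EnergyBank {
    int energy = 0;
    int maxEnergy = 600;   // max_energy
    int perMove = 7;       // energy_per_move

    void OnPedal()     { energy = std::min(energy + perMove, maxEnergy); }
    void OnTimerTick() { energy = std::max(energy - 1, 0); }
    bool ScreenVisible() const { return energy > 0; }
};
```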

PlayStep Source code on github

Here’s some info on how to compile Proton examples.

To allow the Pi to correctly output 1080p HDMI even if the switcher hasn’t booted up yet, I edited /boot/config.txt and set:

hdmi_force_hotplug=1
hdmi_drive=2

To remove the unnecessary border, I also set:

disable_overscan=1

Final thoughts

Might be fun to simply design Pi powered pedal games that use the stepper as a controller.  You could then output straight to a TV or TFT screen without worrying about the spendy chroma-keying solution.

I mean, sure, my kid would refuse to play it, but it could be a funny thing to show at a meet-up or something.

Related things to check out

  • Cycflix: Exercise Powered Entertainment – Uses a laptop to pause netflix if you don’t pedal fast enough.  He connected an arduino directly to the existing stationary bike electronics to measure pedaling, smart.
  • No TV unless you exercise! – Arduino mounted on a stationary bike cuts RCA signal via a relay if you don’t pedal enough.  Uses a black/white detector for movement rather than hall effect sensors.
  • TV Pedaler – A commercial product that blanks screen if you don’t pedal enough that is still being sold? The website and product seem really old (no HDMI support) but they accept Paypal and the creator posted here  a few years ago about his 1999 patent and warned about “copying”.  Hrmph.  His patent covers a bunch of random ideas that his machine doesn’t use at all. Patents like this are dumb, good thing it says “Application status is Expired – Fee Related” I guess.
  • The EnterTRAINER – This defunct commercial device is basically a TV remote control with a heart monitor you strap to your chest.  Controls volume and TV power if your heart rate goes too low. Its hilarious infomercial was posted in one of the reviews.
  • The 123GoTV KidExerciser – Ancient commercial product that lets you use your own bike in the house to blank the TV if not pedalled fast enough.  Company seems gone now.


PaperCart – Make an Atari 2600 that plays QR codes

What is it about the Atari 2600?

It was the summer of 1983 at Jeff Mccall’s slumber party when I saw my first game console.

Crowded around the small TV we gawked at the thing – an Atari VCS.

The seven of us took turns.  Passing the joystick around like a sacred relic, we navigated Pitfall Harry over hazardous lakes, crocodiles, and scorpions.

One by one the other kids fell asleep.  Having no need of such mortal frivolity, I played Pitfall all night!

I fainted in the street the next day due to sleep deprivation.  Worth it.

The challenge

It’s kind of mind-blowing that games that originally sold for over $30 ($70+ in 2018 money) can now be completely stored in a QR code on a small piece of paper.

As a poignant visual metaphor for showing my kids how much technology has changed, I decided to create a Raspberry Pi based Atari that accepts “paper carts” of actual Atari 2600 games.

The requirements for my “PaperCart” Atari VCS:

  • Must use the real QR code format, no cheating by tweaking the format into something a standard QR reader couldn’t read
  • 100% of the game data must actually be read from the QR code, no game roms can be stored in the console, no cheating by just doing a look-up or something
  • Runs on a Raspberry Pi + Picamera with all open source software (well, except the game roms…)
  • Can convert rom files to .html QR codes on the command line; we sort of need this or we’ll have nothing to print and read later
  • Easy enough to use that a kid can insert and remove the “paper carts” and the games will start and stop like you would expect a console to do
  • Standard HDMI out for the video and audio, USB controller to play

All about QR codes

The QR in QR code stands for Quick Response.  It’s a kind of 2D barcode that was invented by a Japanese company named Denso Wave in 1994.  They put it into the public domain right from the get-go, so it’s used in a lot of places in a lot of ways.

QR codes have a secret power – they use something called Reed-Solomon error correction.  It has the amazing ability to fill in missing parts using extra parity data.  The more parity data, the more missing data can be reconstructed.  Not certain parts, ANY OF THE PARTS.  I know, right?

Reed-Solomon is also used in CDs, DVDs, and Blu-rays – that’s why a scratched disc can still work.

Remember those .par files on Usenet you’d use when you were downloading a bunch of stuff in chunks?  Yep, parchives were based on Reed-Solomon.

I hid a fun Atari fact in this code.

I’ve encoded some text in the above QR code with error correction set to Level H (High), which means up to 30% can be missing and you can STILL read it!

Go ahead, block some of it with your fingers, put your phone in camera mode and point it at the QR code above.  Does it work?  That’s the Reed-Solomon stuff kicking in.

QR codes automatically jump to larger sizes to encode more data – from version 1 to version 40.

Can you find your way out of this maze?  Does your brain hurt yet? Hope no one took that seriously and actually tried.

Above is a version 40, the densest version.  My iPhone is able to read this one right off the screen too.  If you have problems, try zooming into the page a bit.

This is the first 2,900 characters of Alice in Wonderland.  We can store a max of 2,953 full bytes.  A byte is 8 “yes or no” bits; with one byte, you can store a number between 0 and 255.

Because text doesn’t need a whole byte, there are smarter ways to store it which would allow us to pack in much more than we did here –  but let’s ignore that as we’re only interested in binary data.

If I show the QR code too clearly, I might be enabling rom piracy and get in trouble. Weird, right?

This game (Stampede) has 2,048 bytes (2K) of rom data so it easily fits inside a single QR code.

Other Activision classics like Fishing Derby and Freeway are also 2K games, but Pitfall! is a 4K game.  Using gzip compression saves us nearly 20%, but it’s still a bit too big to fit in a single QR code.  To work around this I’ve added a “Side B” to the other side of the Pitfall! card.  Cart.  Whatever it is.

My paper cart format stores some metadata so the reader knows how many QR codes are needed for the complete game, and whether two pieces belong to the same game, by storing a rom hash in each piece.
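Conceptually it’s something like this – the field names and header size here are illustrative, not the actual on-paper format:

```cpp
#include <cstdint>

// Illustrative metadata for one "paper cart" piece: which piece this
// is, how many pieces the game needs, and a hash of the whole rom so
// the reader can tell whether two QR codes belong to the same game.
struct CartPiece {
    uint8_t index;       // 1-based piece number ("side")
    uint8_t totalPieces; // how many QR codes make up the game
    uint32_t romHash;    // same value on every piece of one game
};

// How many QR codes a rom needs, given the version-40 byte capacity
// minus a few bytes reserved for the metadata header (header size is
// a guess here).
int PiecesNeeded(int romBytes, int headerBytes = 16)
{
    const int capacity = 2953 - headerBytes;
    return (romBytes + capacity - 1) / capacity; // round up
}
```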

Emulating a 2600 on a Raspberry Pi 3

I started with the latest RetroPie image and put it on a microSD card.  RetroPie has an Atari 2600 emulator out of the box that can be run directly from the command line like this:

/opt/retropie/supplementary/runcommand/runcommand.sh 0 _SYS_ atari2600 <rom file data>

So now I just needed to write some software that monitors what the Pi camera sees, reads QR codes, notices when a QR code disappears or changes, and starts/stops the emulator as appropriate.
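The core decision logic is roughly this – a simplified sketch with illustrative names, not the actual PaperCart source:

```cpp
#include <string>

// Compare the hash of whatever QR code is currently visible against
// what's running, and decide whether to start or stop the emulator.
enum class CartAction { None, StartGame, StopGame };

struct CartWatcher {
    std::string runningHash; // empty = emulator not running

    // visibleHash is empty when no QR code is in view.
    CartAction Update(const std::string& visibleHash)
    {
        if (visibleHash == runningHash) return CartAction::None;
        if (visibleHash.empty()) {
            runningHash.clear();          // cart removed: kill emulator
            return CartAction::StopGame;
        }
        runningHash = visibleHash;        // new/different cart: (re)start
        return CartAction::StartGame;
    }
};
```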

Writing the software – PaperCart

Naturally I chose Proton SDK as the base, as it handles most of what I need already.  For the QR reading I use zbar, and for the webcam reading I use OpenCV, or optionally raspicam instead. (no need for OpenCV on the Raspberry Pi Linux build)  I put it on github here.

The PaperCart binary can also be used from the command line to create QR codes from rom files. (It uses QR-Code-generator)

RTPaperCart.exe -convert myrom.a26

or on the raspberry:

RTPaperCart -convert myrom.a26

It will generate myrom_1_of_1.html, or if multiple QR codes are needed, myrom_1_of_2.html and myrom_2_of_2.html and so on.  I opened them in the web browser, cut and pasted them into Photoshop, scaled them down (disable antialiasing!) to the correct size, and printed them.
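Why disable antialiasing?  Smooth (bilinear) scaling produces gray edge pixels that can push QR modules past the reader’s black/white threshold, while nearest-neighbor keeps every module a crisp square.  A minimal sketch of the idea, on a grayscale buffer:

```cpp
#include <cstdint>
#include <vector>

// Nearest-neighbor scaling: each output pixel copies exactly one input
// pixel, so black stays black and white stays white - no gray edges.
std::vector<uint8_t> ScaleNearest(const std::vector<uint8_t>& src,
                                  int srcW, int srcH, int dstW, int dstH)
{
    std::vector<uint8_t> dst(dstW * dstH);
    for (int y = 0; y < dstH; ++y)
        for (int x = 0; x < dstW; ++x) {
            int sx = x * srcW / dstW;
            int sy = y * srcH / dstH;
            dst[y * dstW + x] = src[sy * srcW + sx];
        }
    return dst;
}
```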

A quick note about zbar and decoding binary data in a QR code

If you want the binary data to look exactly as it went in (and who wouldn’t?!), you need to do a little processing with iconv after zbar returns it.  Here is that magical function for any future googlers:

// Converts zbar's UTF-8 re-encoded output back to the original bytes.
// Requires <iconv.h> and <string>; RT_ICONV_CAST is a Proton macro that
// handles iconv's differing const signatures across platforms.
string FixQRBinaryDataEncoding(string input)
{
	iconv_t cd = iconv_open("ISO-8859-1", "UTF-8");
	if (cd == (iconv_t)-1)
	{
		LogMsg("iconv_open failed!");
		return "";
	}

	int buffSize = (int)input.length() * 2; //plenty of space, the output can only shrink
	char *pOutputBuf = new char[buffSize];
	size_t outbytes = buffSize;
	size_t inbytes = input.length();

	char *pOutPtr = pOutputBuf;
	char *pSrcPtr = &input.at(0);

	do {
		if (iconv(cd, RT_ICONV_CAST &pSrcPtr, &inbytes, &pOutPtr, &outbytes) == (size_t)-1)
		{
			LogMsg("iconv failed!");
			SAFE_DELETE_ARRAY(pOutputBuf);
			iconv_close(cd); //don't leak the descriptor on failure
			return "";
		}
	} while (inbytes > 0 && outbytes > 0);

	iconv_close(cd);

	//copy only the bytes actually written into the returned string
	int finalOutputByteSize = buffSize - (int)outbytes;
	string temp;
	temp.resize(finalOutputByteSize);
	memcpy((void*)temp.c_str(), pOutputBuf, finalOutputByteSize);
	SAFE_DELETE_ARRAY(pOutputBuf);

	return temp;
}

Want to make your own?

It’s pretty straightforward if you’re comfortable with Linux and Raspberry Pi stuff. Here are instructions to set it up and download/compile the necessary software.

(If you really wanted, it’s also possible to do this on Windows – more help on setting up Proton on Windows here – you’d also need the OpenCV libs and Visual Studio in that case)

  • Write the RetroPie image to a microSD card and put it in your Pi
  • Make sure the Atari 2600 emulator works in RetroPie, and set up a gamepad or something
  • Enable SSH logins so you can ssh in.  You don’t have to, but it makes cutting and pasting the instructions below a lot easier

Open a shell to the Raspberry Pi (as user “pi”, probably) and install RTPaperCart and the stuff it needs, starting with:

sudo apt-get install libiconv-hook-dev libzbar-dev

Now we’ll install and compile raspicam, a lib to control the camera with.

Note: It acts a little weird, possibly because it’s using outdated MMAL stuff? In any case, it works “enough” but some fancier modes like binning didn’t seem to do anything.

cd ~
git clone https://github.com/cedricve/raspicam
cd raspicam;mkdir build;cd build
cmake ..
sudo make install
sudo ldconfig

Before we can build RTPaperCart, we’ll need Proton SDK:

cd ~
git clone https://github.com/SethRobinson/proton.git

Build Proton’s RTPack tool:

cd ~/proton/RTPack/linux
sh linux_compile.sh

Download and build RTPaperCart:

cd ~/proton
git clone https://github.com/SethRobinson/RTPaperCart.git
cd ~/proton/RTPaperCart/linux
sh linux_compile.sh

Build the media for it.  This converts the images to .rttex format, a Proton wrapper for many kinds of images.

cd ~/proton/RTPaperCart/media
sh update_media.sh

Now you’re ready to run the software (note: if Emulation Station is running, stop it first with pkill emulationstation):

cd ~/proton/RTPaperCart/bin
sh ./run_app.sh

You might see errors if your camera isn’t available. To enable your camera, plug in a USB one or install a Picamera and use “sudo raspi-config” to enable it under “Interfacing options“. (don’t forget to reboot)

If things work, you’ll see what your camera is seeing on your screen and if a QR code is read, the screen should go blank as it shells to run the atari emulator.

You can point your camera at a QR code on the screen and it will probably work, or go the extra mile and print paper versions because they are fun.  You don’t have to laminate them like I did, but that does help them feel more sturdy.

You might need to twist your camera lens to adjust the focus. 

I set up mine to automatically run when the Pi boots (instead of Emulation Station) so it works very much like a console.

Running RTPaperCart /? will give a list of optional command line options like this:

-w <capture width>
-h <capture height>
-fps <capture fps>
-backgroundfps <background capture fps>
-convert <filename to be converted to a QR code. rtpack and html will be created>

3D Printing the stand

I sort of imagined designing a stylish 2600-themed case with a slot for the paper cart, fully enclosing the Pi, but that would take skill and also require some kind of light inside so the QR code could be read.

So instead I did the minimum – a thing to hold the Pi, the camera, and an easel where you insert the QR code paper.

I used Fusion 360 and designed the stand parametrically so you can fiddle with values to change sizes pretty easily.  The modules are designed to snap together, no screws needed.

You can download the Fusion 360 project here, the download button allows you to choose additional formats too. 

You need to kind of use common sense and print with supports where it looks like you need them.

Conclusion

So that’s great, but you’d like to store Grand Theft Auto 5 as QR codes because they are so convenient?

Let’s see, 70 gigabytes.  No problem.  To convert that you’ll just need about 24 million QR codes.  You might want to order some extra ink now.

At one code per paper, the stack would reach a mile and a half into the sky.
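If you want to check my math – a quick back-of-envelope sketch, assuming ~0.1 mm per sheet of paper:

```cpp
#include <cstdint>

// A version-40 QR code holds at most 2,953 bytes of binary data.
const int64_t kBytesPerQR = 2953;

int64_t QRCodesNeeded(int64_t bytes)
{
    return (bytes + kBytesPerQR - 1) / kBytesPerQR; // round up
}

// Stack height in miles, assuming roughly 0.1 mm per sheet.
double StackHeightMiles(int64_t sheets, double mmPerSheet = 0.1)
{
    return sheets * mmPerSheet / 1609344.0; // there are 1,609,344 mm in a mile
}
```

For 70 GB this works out to roughly 23.7 million codes and a stack just under 1.5 miles tall.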

If you got this far, you must also be a connoisseur of gaming history and old hardware.  Check these out too then:

They Create Worlds (Podcast on gaming history, no fluff)
Matt Chat (Interviews and info about old games in visual form)
The Retro Hour (Podcast with retro gaming interviews and news)
Atari 5200 Multi-ROM Cartridge Using Raspberry Pi  (Cool, something like this might make it possible to mod a real 2600 to read “paper cartridges”.  Small world, Dr. Scott M Baker wrote BBS stuff too, including Land Of Devastation as well as Door Driver, a utility that allowed a dumb kid like me to write BBS games)

How to setup quadview for competitive gaming

Why would you want a quadview setup?

I (badly) play PUBG with my family in a room where we can’t really sit close together and this always happens:

“Sniper in that building.  Upper left window”

“Which building?  Which window?!”

“Look at my screen…  here!”

Turning around to crane your neck at someone else’s computer will get you killed – quadview to the rescue!  A quadview setup puts a live feed of everyone’s screen next to your normal screen.

On the off chance you’re in a situation where this could be useful, here’s how to do it.

Hardware I use

An HDMI quad multi-viewer (like this one, $99) – this has four HDMI inputs (to plug in each person’s view) and one HDMI output.  This is the thing that actually takes the four views and smooshes them into one HDMI signal.

Assuming that each team member is sitting far enough away from each other to need their own quad view, you also need to split this HDMI out into up to four outputs.

An HDMI 1in 4 out splitter (like this one – $18)

Up to four extra monitors to show the quad view – if you’re like me, you have random old monitors (or Contique) laying around that can display HDMI signals.

Getting the inputs – Use OBS, NOT screen mirroring!

So now you need a signal from each person’s computer to plug into the switcher.  I don’t think it’s possible to buy a video card that doesn’t have an extra HDMI out on it these days, and having one is step one.

Run an HDMI cable from the switcher input to a free HDMI output port on your video card.  Open Windows’ Display Settings and verify it’s showing up as the second display.

You may be tempted to choose “Mirror display 1 on display 2” – this will only give you headaches later!  It will reduce the refresh rate and resolution of your main monitor to match this second one; you probably don’t want this.  It’s also kind of weird and buggy in my experience.

Note: I play at a 100 Hz refresh rate at 3440x1440 – the method we use below will automatically letterbox the screen to fit the 1920x1080 quadview output nicely.

The best way is to “extend desktop” to the second monitor.  We’ll also need OBS Studio. It’s a great free open source app used primarily for streaming but works great for this.

Configuring OBS Studio

After installing OBS, the setup is pretty simple.

Add a “Game capture” to the sources property.

Now, this is key, right click your scene’s name under “Scenes” and choose “Fullscreen Projector” and set it to your second display.  If you start a game, you should see its output being “broadcast” via the HDMI going to the quad-view box’s input.

If you want to be cool, add name overlays in OBS.  That’s it, you’re ready to roll.

Some tips

  • It takes practice to use – it’s easy to forget it’s there.  If a squadmate is in a firefight, a quick look can give you important information on her situation before choosing to bust in crazy-like or stealthily approach from a specific direction.
  • If one of your local squad dies and you’re playing with a remote player, the dead person can switch to the remote teammate’s view so everybody can watch what he’s doing as well.
  • Turn the “equipment hud” and “weapon hud” options ON – if the display is big enough, it’s possible to see each player’s backpack, armor, and gun situation.
  • If you’re REALLY serious you can buy a more expensive box that combines the four inputs into a 4K output – I don’t know about you, but I sure don’t have extra 4K monitors sitting around.

Got a Vive Pro – initial thoughts

Is it worth the money? How different is it?

So I broke down and got a Vive Pro despite its exorbitant price tag. Is it worth it?  Well… probably not, unless you’ve got money to burn.

I hate the idea of playing something like Skyrim VR with the old Vive when I know I could be seeing something prettier if I had better hardware.  It’s like that feeling of sadness I had playing Quest For Glory 2 before I had a sound card; I knew I was missing out on some of the experience.

It’s got some nifty tech inside that may be useful later though – dual cameras for AR stuff and Hololens-like collider detection as well as eventual 10×10 meter room support when the new base stations are released. (a 10×10 VR play space in Japan? let me pause to laugh uncontrollably followed by a single tear down the cheek)

In a year they’ll probably have one neat package with all the new stuff, so better for most to wait for that.

I can’t quite put my finger on it, but I felt like the optics were slightly blurrier on the peripheral areas as compared to the original.  Might be my imagination or something specific to my eyes, dunno.  I took some pics through the lenses with both devices with this setup to compare:


It’s all quite scientific, let me assure you. Yeah.

I couldn’t really notice a difference in the edge lens distortion from the pics.  Here’s a comparison of the square from the middle – you can see there really is less screen-door effect now, though.

Pics are from Skyrim VR

My NVIDIA 1080 Ti seems to run content at the same FPS as the old Vive, so no real downside to the switch, I guess.  It seems about equally as comfortable as the original Vive, which is to say, extremely uncomfortable.

4/24/2018 Update: HTC has announced an “aimed at the enterprise” $1,399 Vive Pro full kit that includes the new 2.0 base stations and controllers, which in theory will offer better tracking and huge spaces.  A word of warning – unless they just started shipping with a new cable, the Vive Pro cable is the same length as the Vive’s, meaning larger spaces won’t do you much good until the wireless addon is released later this year. (?)