How to set up quadview for competitive gaming

Why would you want a quadview setup?

I (badly) play PUBG with my family in a room where we can’t really sit close together and this always happens:

“Sniper in that building.  Upper left window”

“Which building?  Which window?!”

“Look at my screen…  here!”

Turning around to crane your neck at someone else’s computer will get you killed – quadview to the rescue!  A quadview setup puts a live feed of everyone’s screen next to your normal screen.

In the very unlikely chance you’re in a situation where this could be useful, here’s how to do it.

Hardware I use

An HDMI Quad Multi-viewer (like this one – $99) – this has four HDMI inputs (to plug in each person’s view) and one HDMI output.  This is the thing that actually takes the four views and smooshes them into one HDMI signal.

Assuming that each team member is sitting far enough away from each other to need their own quad view, you also need to split this HDMI out into up to four outputs.

An HDMI 1in 4 out splitter (like this one – $18)

Up to four extra monitors to show the quad view – if you’re like me, you have random old monitors (or Contique) lying around that can display HDMI signals.

Getting the inputs – Use OBS, NOT screen mirroring!

So now you need a signal from each person’s computer to plug into the switcher.  I don’t think it’s possible to buy a video card these days that doesn’t have an extra HDMI out on it – having one is step one.

Run an HDMI cable from the switcher input to a free HDMI output port on your video card.  Open Windows’ Display Settings and verify it’s showing up as the second display.

You may be tempted to choose “Mirror display 1 on display 2” – this will only give you headaches later!  It will reduce the refresh rate and resolution of your main monitor to match this second one; you probably don’t want this.  It’s also kind of weird and buggy in my experience.

Note: I play at 100 Hz refresh at 3440×1440 – the method we use below will automatically letterbox the screen to fit the 1920×1080 output for quadview nicely.

The best way is to “extend desktop” to the second monitor.  We’ll also need OBS Studio – a great, free, open source app used primarily for streaming, but it works great for this.

Configuring OBS Studio

After installing OBS, the setup is pretty simple.

Add a “Game Capture” source under “Sources”.

Now, this is key: right-click your scene’s name under “Scenes”, choose “Fullscreen Projector”, and set it to your second display.  If you start a game, you should see its output being “broadcast” via the HDMI going to the quad-view box’s input.

If you want to be cool, add name overlays in OBS.  That’s it, you’re ready to roll.

Some tips

  • It takes practice to use – it’s easy to forget it’s there.  If a squadmate is in a firefight, a quick look can give you important information on her situation before choosing to bust in crazy-like or stealthily approach from a specific direction.
  • If one of your local squad dies and you’re playing with a remote player, the dead person can switch to the remote teammate’s view so everybody can watch what he’s doing as well.
  • Turn the “equipment hud” and “weapon hud” options ON – if the display is big enough, it’s possible to see the player’s backpack, armor, and gun situation.
  • If you’re REALLY serious you can buy a more expensive box that puts the four inputs together into a 4K output – don’t know about you, but I sure don’t have extra 4K monitors sitting around though.

Got a Vive Pro – initial thoughts

Is it worth the money? How different is it?

So I broke down and got a Vive Pro despite its exorbitant price tag. Is it worth it?  Well… probably not, unless you’ve got money to burn.

I hate the idea of playing something like Skyrim VR with the old Vive when I know I could be seeing something prettier if I had better hardware.  It’s like that feeling of sadness I had playing Quest For Glory 2 before I had a sound card; I knew I was missing out on some of the experience.

It’s got some nifty tech inside that may be useful later though – dual cameras for AR stuff and Hololens-like collider detection as well as eventual 10×10 meter room support when the new base stations are released. (a 10×10 VR play space in Japan? let me pause to laugh uncontrollably followed by a single tear down the cheek)

In a year they’ll probably have one neat package with all the new stuff, so better for most to wait for that.

I can’t quite put my finger on it, but I felt like the optics were slightly blurrier on the peripheral areas as compared to the original.  Might be my imagination or something specific to my eyes, dunno.  I took some pics through the lenses with both devices with this setup to compare:


It’s all quite scientific, let me assure you. Yeah.

I couldn’t really notice a difference in the edge lens distortion from the pics.  Here’s a comparison of the square from the middle; you can see there really is less screen-door effect now, though.

Pics are from Skyrim VR

My NVidia 1080ti seems to run content at the same FPS as the old Vive, so no real downside to the switch I guess.  It seems about as comfortable as the original Vive, which is to say, extremely uncomfortable.

4/24 2018 Update: HTC has announced an “aimed at the enterprise” $1399 Vive Pro full kit that includes the new 2.0 base stations and controllers, which in theory will offer better tracking and huge spaces.  A word of warning – unless they just started shipping with a new cable, the Vive Pro cable is the same length as the Vive, meaning larger spaces wouldn’t do you much good until the wireless addon is released later this year. (?)

Funeral Quest (the multiplayer funeral parlor simulation) is now downloadable and free

The rumors of Funeral Quest’s death have been greatly… true

It’s been nearly five years since any would-be undertaker has graced its halls.  In 2013 the old XP machine that was running the only FQ server in the world was decommissioned – and that was that, a game like no other silently departed with nary a single YouTube video left in remembrance.

I wrote it in 2001 to see if a BBS door-like financial structure could make sense in a modern web environment.

I charged $99 (I think?) a year for the enterprising admin (or, a SysOp if you prefer…)  to run their own FQ server which they could customize.  It was free for players.

Example of customizable data. I still use LORD color codes everywhere. (I’m pretty sure Greg Smith wrote this event btw…):

message|"It's like a stabbing ... but different," comments the coroner's office.
give_random|`wThe phone rings - It's Golden Oaks Retirement Home.\n\n`$"We've got some bodies for you..."
add_log|`7FUNERAL HOME CHOSEN\n`^Golden Oaks Retirement Home has announced that *NAME* from *HOME* will be hosting services for the recently departed.
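For illustration only (the format details are guesses from the three lines above): each event line looks like command|payload, where the payload keeps its backtick color codes and uses backslash-n escapes for line breaks. A minimal parse sketch in Python:

```python
def parse_event_line(line):
    """Split a Funeral Quest-style event line into (command, payload).

    Assumed from the sample above: one pipe separates the command name
    from its payload; the payload keeps its backtick color codes (`w,
    `$, `7, ...) and uses backslash-n escapes for line breaks.
    """
    command, _, payload = line.partition("|")
    return command, payload.replace("\\n", "\n")

cmd, text = parse_event_line("add_log|`7FUNERAL HOME CHOSEN\\n`^...")
print(cmd)   # add_log
```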

Did the grand experiment work?

Well, no.  In retrospect it’s pretty obvious why: why would anyone need to run their own server when calling areas and zip codes no longer exist?  Why would anyone care that they could make a customized version of something so niche as a “funeral parlor simulator”?

The Flash login screen. It somehow works perfectly in the latest version of Chrome!  For now…

For some reason I made the server contain a complete HTTP server and run under Windows.  It makes it really easy to set up and use but… yeah, not how I’d do it today.

Despite all that, I’m extremely proud of Funeral Quest and have been wanting to repackage a full version with all of the licensing-related limitations ripped out for a while. (As I did with Dink Smallwood HD earlier this year)

Well, I finally found the time so here it is.  If anybody actually sets up a working public game of FQ let me know and I’ll help get the word out.  Check the FQ forums if you have problems or questions.

Funeral Quest Server free version (this is all you need to run your own FQ game)

Funeral Quest Flash Source (full flash source for the client part, not required)

The readme file inside:

Funeral Quest
Copyright 2001: Robinson Technologies, all rights reserved

The "should have been released for free eons ago" final free release

Released 3/28 2018
This is a special version of the Funeral Quest server that has been modified to no longer need a license, it's the "full version" so to speak.

It can be used to play the game locally or run a real server so hundreds of people can play together.


* This version has a few stability fixes since the official last release, I think
* I was nicely surprised that in my local testing (clicking the Logon 1 button) the game seemed to work fine on current versions of both Chrome and Firefox's Flash, if the FQ port remains on the default of 80, anyway. However, who knows what will happen on a real server...
* Run FQServer/fqserver.exe to start the server. Note that when minimized it runs as a tray app, so if it disappeared, check your system tray.

Problems or questions? Check out the official Funeral Quest forums:

Big thanks to FQ fans and sorry I didn't release this sooner. If you actually get it running and want the full C++ source code in all its MSVC2005 MFC glory, let me know, can probably do that.

-Seth A. Robinson (

Some random screenshots of FQ (some are of the server, and some a browser playing it)

The entire server (which includes its own HTTP server, text to speech notifications, GUI front end) + all Flash client files is less than 1.5 MB zipped

Akiko did all the artwork in Funeral Quest

Many game texts could be customized. A powerful C style scripting language with variable passing and functions was also available. (It’s the same scripting engine that was in Teenage Lawnmower)

It takes skill to read your customers. Training in psychology makes more information about their mood and feelings available.

Maybe now FQ can truly rest in peace.

How to get your Unity LLAPI/WebSocket WebGL app to run under https with AutoSSL & stunnel

<continuing my “blog about whatever random issue I last dealt with in the hopes that some poor soul with the same issue will google it one day” series>

The problem

So you made your new Unity WebGL game using the LLAPI and it works fine from an http:// address.  But when you try with https, even with a valid https cert installed, you get this error:

“Uncaught SecurityError: Failed to construct ‘WebSocket’: An insecure WebSocket connection may not be initiated from a page loaded over HTTPS.”

This is your browser saying “Look, the website is https, but don’t let that fool you; it’s using a normal old web socket to send data under the hood which isn’t encrypted, so don’t trust this thing with your credit card numbers”.

Unity (at the time of this writing) has no internal support for what we really need to be using: a Secure WebSocket.  So where http has https, ws has wss.  So how do we connect securely if our Unity-based server binary can’t serve wss directly?
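The scheme mapping itself is mechanical. As a quick illustration in Python (the host name is a placeholder; port 30000 is just an example):

```python
def socket_url(host, port, page_is_https):
    """Pick the WebSocket scheme that matches the page's scheme.

    Browsers treat a ws:// connection from an https:// page as mixed
    content and block it, so an https page has to use wss://.
    """
    scheme = "wss" if page_is_https else "ws"
    return f"{scheme}://{host}:{port}"

print(socket_url("example.com", 30000, True))   # wss://example.com:30000
```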

A little background info about CPanel & AutoSSL

Note: I’m using CentOS 7 on a dedicated server with WHM/CPanel

Setting up your website for proper SSL so it can have that wonderful green padlock used to be a painful and sometimes expensive ordeal.

But no longer!  Enter the magic of CPanel’s AutoSSL.  (I think it’s using Let’s Encrypt under the hood as a plugin?)  Behind the scenes, it will handle domain validation and set everything up for you.  While it does need to renew your cert every three months, it’s free and automatic.  Add four new domains?  They will all get valid certs within a day or so, it’s great.

We can use this same cert to make your websockets secure as long as they are hosted at the same domain.

Setting up stunnel

This is an open source utility that is likely already included on your Linux server box; if it isn’t, go install it with yum or something.

It allows you to convert any socket into a secure socket.  For example, if you have a telnet port at 1000, you could set up stunnel to listen at 1001 securely and relay all information back to 1000.

The telnet connection has no idea what’s happening and sees no difference, but as long as the outside user can only access 1001, plain text information isn’t sent along the wire and one or both sides can be sure of the identity of who’s connecting.
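Conceptually, the relay half of that is just a byte pump between two sockets. Here’s a minimal plaintext sketch of that forwarding loop in Python – stunnel additionally wraps the accepting side in TLS, which is omitted here, and the ports/addresses are made up:

```python
import socket
import threading

def relay(listen_port, target_port):
    """Accept one connection on listen_port and shuttle bytes
    to/from a local service on target_port.

    This is only the forwarding half of what stunnel does -- stunnel
    also wraps the accepting side in TLS, so the outside world sees
    an encrypted socket while the target still sees plain bytes.
    """
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", listen_port))
    srv.listen(1)
    client, _ = srv.accept()
    upstream = socket.create_connection(("127.0.0.1", target_port))

    def pump(src, dst):
        # Copy bytes one way until the source closes.
        while True:
            data = src.recv(4096)
            if not data:
                dst.close()
                return
            dst.sendall(data)

    # One thread per direction.
    threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
    pump(upstream, client)
```

In the telnet example above, this would be `relay(1001, 1000)` – minus the TLS wrap that is stunnel’s whole point.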

Depending on the stunnel settings, it might be set up like https, where the client doesn’t need any particular keys (what we want here), or it could be like ssh, where the client DOES need a whitelisted key.

A way to test an SSL port is to use OpenSSL from the command line on the host server via ssh.  For example (keep in mind 443 is the standard https port your website is probably using):

<at ssh prompt> openssl s_client -connect localhost:443

<info snipped>
subject=/OU=Domain Control Validated/OU=PositiveSSL/
issuer=/C=US/ST=TX/L=Houston/O=cPanel, Inc./CN=cPanel, Inc. Certification Authority
No client certificate CA names sent
Peer signing digest: SHA512
Server Temp Key: ECDH, P-256, 256 bits
SSL handshake has read 4946 bytes and written 415 bytes
New, TLSv1/SSLv3, Cipher is ECDHE-RSA-AES256-GCM-SHA384
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
 Protocol : TLSv1.2
<info snipped>
Start Time: 1518495864
 Timeout : 300 (sec)
 Verify return code: 0 (ok)

Hitting enter after that will probably cause the website to return an HTML error message because we didn’t send a valid request. That’s OK; it shows your website’s existing SSL stuff is working, so we can move on.

So first edit your /etc/stunnel/stunnel.conf to something like this:

pid = /etc/stunnel/

#we won't screw with changing this because we don't want to relocate/change permissions on our files right now
#setuid = nobody
#setgid = nobody

sslVersion = all
options = NO_SSLv2

#for testing purposes.. these should be removed later:
output = /etc/stunnel/log.txt
foreground = yes
debug = 7

[websitename1]
accept = 29000
connect = 80
cert = /var/cpanel/ssl/apache_tls/

[websitename2]
accept = 30000
connect = 20000
cert = /var/cpanel/ssl/apache_tls/

Next, still from the ssh prompt, run stunnel by typing stunnel.

Because we have foreground=yes set above it will run it in the shell, showing us all output directly, instead of in the background like it normally would. (Ctrl-C to cause stunnel to stop and quit)

Look for any issues or errors it reports.  The .conf file I listed above shows how to set it up for two or more tunnels at once; you likely only need one of those tunnels.

The “websitename1” part doesn’t matter or have to match anything.

The SSL cert is the most important setting.  You need to give it your private key, public cert, and CA info in the same file.

Now, initially, you might try to set up your keys using the files in ~/ssl/keys and ~/ssl/certs, but they seem to not have everything all in one nice file including the CA certs.  I figured out ‘bundled’ ones already exist in a cpanel directory, so I linked straight to them there.  (replace with your website name)
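If you’re ever unsure whether a given file really is one of those all-in-one bundles, a crude check is to count the PEM blocks. A quick sanity-check sketch in Python – the thresholds are assumptions (one private key, plus your cert and at least one CA cert):

```python
def looks_like_full_bundle(pem_text):
    """Rough check that a PEM file has what stunnel's cert= wants:
    a private key plus at least two certificates (yours + CA chain).
    """
    has_key = "PRIVATE KEY-----" in pem_text  # matches RSA/EC/PKCS#8 headers
    cert_count = pem_text.count("-----BEGIN CERTIFICATE-----")
    return has_key and cert_count >= 2

# Demo on a synthetic bundle (real files would come from the cpanel dir):
fake = (
    "-----BEGIN RSA PRIVATE KEY-----\n...\n-----END RSA PRIVATE KEY-----\n"
    "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n"
    "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n"
)
print(looks_like_full_bundle(fake))   # True
```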

If stuff worked, you should be able to test your SSL’ed port with OpenSSL again.  In the example above under “websitename1” I told it to listen at 29000 and send to port 80, for no good reason.

So to test from a remote computer we can do:

(you did open those ports in your firewall so outside people can connect, right?)

C:\Users\Seth>openssl s_client -connect
Loading 'screen' into random state - done
depth=2 /C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Certification Authority
verify error:num=20:unable to get local issuer certificate
verify return:0
Certificate chain
 0 s:/
 i:/C=US/ST=TX/L=Houston/O=cPanel, Inc./CN=cPanel, Inc. Certification Authority
 1 s:/C=US/ST=TX/L=Houston/O=cPanel, Inc./CN=cPanel, Inc. Certification Authority
 i:/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Certification Authority
 2 s:/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Certification Authority
 i:/C=SE/O=AddTrust AB/OU=AddTrust External TTP Network/CN=AddTrust External CA Root
Server certificate
issuer=/C=US/ST=TX/L=Houston/O=cPanel, Inc./CN=cPanel, Inc. Certification Authority
No client certificate CA names sent
SSL handshake has read 5129 bytes and written 453 bytes
New, TLSv1/SSLv3, Cipher is DHE-RSA-AES256-SHA
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
 Protocol : TLSv1
 Key-Arg : None
 Start Time: 1518497616
 Timeout : 300 (sec)
 Verify return code: 20 (unable to get local issuer certificate)

Despite the errno=11093 and return code 20 errors, it’s working and properly sending our CA info (“cPanel, Inc. Certification Authority”).

Or, easier, let’s just use the browser for this one, since this tunnel happens to connect to port 80 anyway:

It worked, see the green padlock?  Oh, ignore the error the website is sending – I assume that’s Apache freaking out because the URL request is different from what it’s expecting (http vs https, or the port difference?) so it can’t match up the virtual domain.

From here, you should probably remove the debug options in the .conf (including the foreground=yes) and set it up to run automatically.  I just placed “stunnel” in my /etc/rc.d/rc.local file. (this gets run at boot)

Actually connecting using the Unity LLAPI

Congratulations, everything is set up on the server and you’re sure your web socket port is listening and ready to go.

While your server binary doesn’t need to change anything, your webgl client does.

You now need to connect to WSS instead of WS.  Example:

   try
   {
       _connectionID = NetworkTransport.Connect(_hostID, "wss://", portNum, 0, out error);
   }
   catch (System.Exception ex)
   {
       Debug.Log("RTNetworkClient.Connect> " + ex.Message);
   }

That’s pretty much it.  If someone doesn’t care about https and decides to play over http, it still works fine. (internally the websocket code will still connect via wss)

If you want to see it in action, check out my webgl llapi multiplayer test project