Category Archives: Tech Tips

Subfish: Search youtube channels for spoken words, export clips to Premiere/DaVinci Resolve timeline

Features:

  • Download all subtitles from a channel or playlist
  • Search subtitles for keywords (regex supported; see the sketch below)
  • Export video timeline of all found clips as .EDL for Premiere or DaVinci Resolve
  • Will notify you on startup if a newer version is available
  • Full source code is on GitHub, so feel free to submit patches, report bugs, or make feature requests.
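
Subfish does the searching for you, but if you’re curious what a keyword search boils down to, here’s a minimal Python sketch that regex-scans a folder of downloaded .vtt subtitle files. (The folder layout and file names are my assumptions for illustration, not how Subfish stores things internally.)

import re
from pathlib import Path

# Example keyword regex - matches "banana" or "bananas" as a whole word
pattern = re.compile(r"\bbanana(s)?\b", re.IGNORECASE)

# Assumed layout: one .vtt subtitle file per video in ./subtitles
for vtt in Path("subtitles").glob("*.vtt"):
    timestamp = None
    for line in vtt.read_text(encoding="utf-8").splitlines():
        if "-->" in line:  # cue timing line, e.g. 00:01:02.000 --> 00:01:04.000
            timestamp = line.split("-->")[0].strip()
        elif timestamp and pattern.search(line):
            print("{} @ {}: {}".format(vtt.name, timestamp, line.strip()))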
Exporting found clips to a Premiere/Resolve timeline

Here’s an example of exporting a lot of clips at once based on finding a single word, then using Resolve’s scripting to automatically add the count, date, and youtube title of the video each clip came from. (The only “work” I had to do was hand-nudge the clips forward and back a little so only the correct word was heard, instead of a few seconds before and after as well.)
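
For reference, the CMX 3600 style .EDL that Premiere and Resolve import is just plain text, with one event per clip, roughly like this. (This is a generic illustration of the format, not Subfish’s exact output.)

TITLE: SUBFISH CLIPS
FCM: NON-DROP FRAME

001  AX       V     C        00:01:10:00 00:01:13:00 01:00:00:00 01:00:03:00
* FROM CLIP NAME: some_downloaded_video.mp4

Each event maps a source in/out timecode pair to a position on the record timeline.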

Download/install instructions (Windows)

To run Subfish, you need to download some Windows libraries from Microsoft first because I’m too lazy to make an installer for now. Requires Windows 7 or newer.

1. Install the .NET 5.0 Desktop Runtime. (Windows version is here)

2. Install the WebView2 runtime. (Try here – you probably want the x64 version.)

3. Download the latest version of Subfish (in a zip) for Windows.

Inside the zip there is a folder called “Subfish”. Drag that folder onto your desktop (or somewhere) to extract it. Then enter it and run Subfish.exe. (The binaries are signed by RTsoft so Windows shouldn’t give you any trouble running it)

An exciting screenshot

Why I made this

Earlier I was doing some youtube research and needed to look through thousands of videos for spoken words. While I did figure out a way to do it using youtube-dl and text scanning utilities, it was a clunky process, and I couldn’t instantly jump to the exact spot in a video to preview it without some shenanigans.

“This is stupid, someone must have made a slick front end for this…” and well, I couldn’t find one, so here we are. As for the name, well, check out my other free utility Toolfish!

The timeline export options were actually added for a friend, but they’re pretty handy too.

Info & Issues

Audio sync problem after importing the EDL timeline into DaVinci Resolve? I think this is a Resolve bug when importing something that has clips with multiple internal timings. No problem – I created a script to fix it; check the ScriptsForDaVinciResolve subdirectory. The readme.txt there explains how to copy FixTimelineSync.py into %PROGRAMDATA%\Blackmagic Design\DaVinci Resolve\Fusion\Scripts\Comp so you can run Workspace->Scripts->Comp->FixTimelineSync in Resolve.

Is there a way to automatically add the date, video counter, video name, etc. on top of the videos in the exported timeline? Yes, I’ve done it with DaVinci Resolve scripts; I’m hoping to do a tutorial on that later as it’s kind of tricky. The .json metadata we export with each video is useful for this.
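
As a rough idea of the approach, here’s a Python sketch that walks the exported metadata and builds the overlay text for each clip. (The folder and field names are guesses for illustration – inspect the actual .json Subfish writes before relying on them.)

import json
from pathlib import Path

# Assumed: one .json metadata file sitting next to each downloaded video
for count, meta_file in enumerate(sorted(Path("downloads").glob("*.json")), start=1):
    meta = json.loads(meta_file.read_text(encoding="utf-8"))
    overlay_text = "#{} | {} | {}".format(
        count,
        meta.get("upload_date", "????"),    # hypothetical field name
        meta.get("title", meta_file.stem),  # hypothetical field name
    )
    print(overlay_text)  # in Resolve, you'd push this into a Text+ node via scripting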

It seems to stop after downloading around 4500 subtitles? This seems to be a youtube limitation. One trick is to download again in reverse order, so 9,000+ from a single channel is possible. I think with some changes to optimize youtube-dl I could have it “continue” pulling data in a much smarter way, but I haven’t been bothered enough to try yet. (youtube-dl’s current date restriction options just don’t work right for subtitles; it still checks every video in order)

I’m getting “This browser or app may not be secure.” when I try to login to my Google/Youtube account in the preview window?! Yeah, I started getting that recently too. Luckily it has nothing to do with the actual text/video extraction process, but clip previewing tends to show google ads if you’re not logged in. I think you can fix this by enabling “Less secure apps” but I haven’t actually tried it.

OSX/Linux support? Cross-compiling is a problem due to using WebView2 for now, so I guess that’s out. On a side note, in theory this does support Windows 10 ARM-based devices too, but I don’t have one to test with.

Privacy – On startup, Subfish visits www.rtsoft.com/subfish/checking_for_new_version.php?version=<version #> in its little web-browser thingie which will give a download link if a new version is out. That’s the only communication done with our servers.
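
If you want to verify that’s all it does, the check amounts to something like this Python sketch. (The response handling is a guess on my part – per the above, the page returns a download link when a newer version is out.)

import urllib.request

APP_VERSION = "1.0"  # whatever version your build reports

url = ("https://www.rtsoft.com/subfish/checking_for_new_version.php?version="
       + APP_VERSION)

with urllib.request.urlopen(url) as resp:
    body = resp.read().decode("utf-8", errors="replace")

# Presumably no link comes back when you're already up to date
if "http" in body:
    print("Newer version available:", body.strip())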

Legal – Only use this product if it’s legal to do what you’re doing where you’re doing it. That’s probably good advice for life in general.

To report a bug or feature request – Post here, ping me on twitter, or drop me an email.

Using computer vision to enforce sleeping pose with the Jetson Nano and OpenCV

(special thanks to Eon-kun for helping demonstrate what it looks like)

Imagine you HAVE to sleep on your back for some reason and possibly restrict neck movement during the night. Here are some options:

  • Tennis balls strapped to your sides
  • Placing an iphone on your chest/pocket and using an app (SomnoPose) that monitors position with the accelerometer and beeps when it detects angle changes. (It works OK, but the app is old and has some quirks, like not running in the background.)

The above methods are missing something though – they don’t detect head rotation. If you look at the wall instead of the ceiling while not moving your body, they don’t know.

The tiny $99 Jetson Nano computer combined with a low light USB camera can solve this problem in under 100 lines of Python code! (A Raspberry Pi would work too)

The open source software OpenCV is used to process the camera images. When the program can’t detect a face, it plays an annoying sound until it does, forcing you to wake up and move back into the correct position so you can enjoy sweet silence.

If you’re interested in playing with stuff like this, I recommend Paul McWhorter’s “AI on the Jetson Nano” tutorial series; the code below can be used with that.

I’m really excited about the potential of DIY electronics projects like this to help with real life solutions.

The Pi and Nano have GPIO pins so instead of playing a sound, we could just as easily activate a motor, turn a power switch on, whatever.
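
For example, swapping the warning sound for a buzzer or relay takes only a few lines with the Jetson.GPIO library. (The pin number and wiring are my assumptions here; on a Raspberry Pi, the RPi.GPIO API is nearly identical.)

import time
import Jetson.GPIO as GPIO  # on a Raspberry Pi: import RPi.GPIO as GPIO

BUZZER_PIN = 12  # assumed: buzzer or relay module wired to board pin 12

GPIO.setmode(GPIO.BOARD)
GPIO.setup(BUZZER_PIN, GPIO.OUT, initial=GPIO.LOW)

def BuzzWarning():
    # could stand in for the os.system() sound call in the listing below
    GPIO.output(BUZZER_PIN, GPIO.HIGH)  # buzzer/relay on
    time.sleep(1.0)
    GPIO.output(BUZZER_PIN, GPIO.LOW)   # and back off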

Of course, instead of just tracking faces, it’s also possible to look for eyes, colors, shapes, or cars – anything, really.

The Python code listing for forcing you to sleep on your back

import cv2
import time
import os

dispW=1024
dispH=768
timeAllowedWithNoFaceDetectBeforeWarning = 22
timeBetweenWarningsSeconds = 10

timeOfLastFaceDetect = time.time()
timeSinceLastDetect = time.time()
timeOfLastWarning = time.time()
warningCount = 0

def PlayWarningIfNeeded():
    global timeBetweenWarningsSeconds
    global timeOfLastWarning
    global warningCount

    if time.time() - timeOfLastWarning > timeBetweenWarningsSeconds:
        print ("WARNING!")
        warningCount = warningCount + 1
        os.system("gst-launch-1.0 -v filesrc location=/home/nano/win.wav ! wavparse ! audioconvert ! audioresample ! pulsesink")
        timeOfLastWarning = time.time()


bCloseProgram = False

cv2.namedWindow('nanoCam')
cv2.moveWindow('nanoCam', 0,0)
cam = cv2.VideoCapture("/dev/video0")

cam.set(cv2.CAP_PROP_FRAME_WIDTH,int(dispW))
cam.set(cv2.CAP_PROP_FRAME_HEIGHT,int(dispH))
cam.set(cv2.CAP_PROP_FPS, int(10))
face_cascade = cv2.CascadeClassifier('/home/nano/Desktop/Python/haarcascades/haarcascade_frontalface_default.xml')
fnt = cv2.FONT_HERSHEY_DUPLEX

while True:

    ret, frame = cam.read()
    frame = cv2.flip(frame, 0) #vertical flip

    #rotate 90 degrees
    #frame = cv2.rotate(frame, cv2.ROTATE_90_CLOCKWISE)
    gray=cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)

    for (x,y,w,h) in faces:
        cv2.rectangle(frame, (x,y), (x+w, y+h), (0,255,0), 4)
        timeOfLastFaceDetect = time.time()

    timeSinceLastDetect = time.time() - timeOfLastFaceDetect

    if timeSinceLastDetect > timeAllowedWithNoFaceDetectBeforeWarning:
        PlayWarningIfNeeded()
        
    text = "Seconds since face: {:.1f} ".format(timeSinceLastDetect)
    frame = cv2.putText(frame, text, (10,dispH-65),fnt, 1.5,(0,0,255), 2)

    text = "Warnings: {} ".format(warningCount)
    frame = cv2.putText(frame, text, (10,dispH-120),fnt, 1.5,(0,255,255), 2)

    cv2.imshow('nanoCam',frame)
    if cv2.waitKey(10) == ord('q') or cv2.getWindowProperty('nanoCam', cv2.WND_PROP_AUTOSIZE) < 1:
        bCloseProgram = True
    
    if (bCloseProgram):
        break

cam.release()
cv2.destroyAllWindows()

Magnavox Odyssey 2 Atari Joystick mod

$179 in 1978. That’s $705 in 2019 money.

Meet the Odyssey 2

It’s an old-ass game console from 1978 that I recently picked up from eBay. It even has a built-in keyboard for some reason.

In the first real “console war” it placed third out of, well, three, ending up behind the Atari VCS and Intellivision despite some neat add-ons like a voice module.

It’s famous for hosting K.C. Munchkin, a Pac-Man-like game that was forced off the market in a giant lawsuit for being a clone. However, looking back at it, it played differently enough that this really shouldn’t have happened. Sorry K.C., you deserved better.

I buy old systems like this and try to fix them up because it’s a cheap and fun way to learn electronics and gaming history.  Oh, and you get to play your homework!

Care for a game of golf? It’s actually not bad. Your golfer gets mad when he hits trees.

For this one I did a composite video mod as that’s a huge improvement over the original noisy RF that requires a US TV tuned to channel 3 or 4.

The joysticks it came with were completely broken. Unfortunately, finding replacement controllers for old systems is quite difficult, and fixing them to work like new almost always requires new membranes or other parts that aren’t made anymore.

The solution? Wire it up to accept a standard Atari 2600 controller! It’s kind of a ubiquitous standard that’s compatible with the Commodore 64 and, believe it or not, Genesis/Mega Drive controllers. I got the idea from ArcadeUSA’s youtube video, but I did it a slightly different way so I could still use the original controllers as well.

The electronics in Atari 2600 controllers are the same, but the pinouts are different, so you can’t plug them in directly. I used what I had on hand: a “6inch DB9 Female Port to Dual DB9 RS232 Male Serial Y Splitter Ribbon Flat Cable” for the adapter. $4 from eBay; I cut off the parts I needed with scissors.


I soldered at the pins below the DB9 jacks – it’s easier there. Above is a labeled picture that might save somebody time. Notice that the player 2 wiring is SLIGHTLY different from player 1’s.

It’s ugly the way they stick out and will surely break soon. But it works great!

Conclusion

It would probably be better to just make DB9-to-DB9 adapters to fix the pin differences (you could unplug them when not needed), but I didn’t have the right stuff handy. Someone should make a simple circuit board to do both ports together, something like this useful joystick port toggle I got for my C128.

Thanks Seth, for something I didn’t care about. When will you actually make a game?

Yeah, yeah.  We’ve been working on stuff behind the scenes and will soon be upgrading RTsoft to a new HQ in Kyoto.  The idea is to be a real (well, slightly more legit than now anyway) game studio as well as a sometimes kind of public hackerspace/cafe, more later.

Input lag fun – measuring Atari 2600 latency from controller to display with an Arduino

Input lag.  It’s a catch-all name people use when talking about the latency that gets added in the various places between when you push a button and finally see results on your screen.

Many (especially console) games these days are designed to hide it because the developers cannot predict the latency of the display device the player is using.

There are often additional considerations such as video drivers, vsync, “Game mode” display options, refresh rate, back buffering and dealing with high latency wireless controllers.

If I added 100 milliseconds of extra lag to Red Dead Redemption 2, I doubt you’d notice with its mushy-ass controls. But if you try that with old-school platformers and bullet hells designed for ultra-low lag, well, it ain’t gonna be pretty.

If you’re skeptical of the difference input lag can make, try this:

Super Mario Bros. 2 (Japan) is just mean. See that mushroom? It kills you!

Play the murderous Super Mario Brothers 2 (Japan) directly from a Famicom on an old school CRT and then try the same thing on your Raspberry Pi, PC, or even the same NES console through an upscaler/LCD tv.

As you switch back and forth you’ll probably feel the difference.  The game is more difficult and less comfortable with the extra lag.

It’s not just about reaction times – there is this thing your brain does where it’s forced to jump and move slightly earlier than the onscreen action. We can all automatically do it, sure, but it’s… different. It doesn’t feel as connected. It’s too… I don’t know, milky. A lot depends on the number of frames missed, as well as when the console polls input. (For example, the Atari 2600 polls 30 times a second, during the vertical blank interrupt.)

This goes for much of the 8 and 16 bit action content from consoles and computers of yesteryear.

So real hardware and CRTs are the only way to go for fast response controls?

Woah, settle down, I didn’t say that!  Retro emulation is astonishing and with the right gear and settings it should be possible to match or even beat old-school latency in specific cases such as with Run Ahead on the NES.

That said, I have not been able to do it yet with my setups.  Additionally, there are sometimes trade-offs like screen tearing and visual artifacts when you’re aiming at ultra-low latency.  I’m sure things will continue to improve given the recent focus on reducing input lag on both displays and controllers.

(Note: I was going to say “twitch gaming”  instead of “fast response controls” but I’m guessing that term is too ambiguous these days)

Measuring input lag accurately

Instead of getting lost in subjective testing by “feel”, let’s get scientific about it.  For my purposes, I wanted to measure the exact latency between a button press on an Atari 2600 console and the display pixels changing.  The full trip.

<disclaimer – I’m an electronics amateur and don’t know what I’m doing.  Breaking stuff is how you learn, right?> 

I used a cheap Arduino Uno with an LCD/button shield (they call things you stick on them shields, ok?) to do it.  It’s so simple you don’t even need a breadboard!

The light sensor

First I wired up an LDR (Light Dependent Resistor) to the board’s analog-in pin A0 and connected the other end to ground. We can now detect light with an analogRead(A0) command. Oh right, I also put a 100K resistor in there.

To get the light sensor to physically stick on the screen (so it will hover over the pixels in question), I commandeered the suction cup that came in an iFixit kit and glue-gunned a rubber earbud cover onto it to house the sensor.

Just nod and smile please, without commenting how there is glue everywhere and it looks ridiculous.

My light sensor returns a number between 300 and 1024 or so. When stuck onto a monitor, it’s more like 800 to 1000 (between black and white… black pixels still have a lot of light coming through, I guess), but good enough.

Getting the Arduino Uno to push the Atari fire button digitally

Next I cut the cable off a broken Atari 2600 controller (a cheap clone from eBay that I broke during a particularly exuberant C64 performance) and, using this diagram, figured out which wires were the fire button and ground.

I connected the controller’s ground wire to the Arduino’s ground pin, then the fire button wire to the D2 pin. I can now control the fire button like this:

pinMode(C_ATARI_BUTTON_PIN, OUTPUT);

// Release the button (the line sits HIGH)
digitalWrite(C_ATARI_BUTTON_PIN, HIGH);

// Press the button (pulling the line LOW reads as "fire" to the Atari)
digitalWrite(C_ATARI_BUTTON_PIN, LOW);

I didn’t know you could directly wire it like that, mixing the 5V high signals from both the Atari and Arduino, but whatever, it works. Read this post by BigO to understand why setting the pin LOW counts as “on” as far as the Atari is concerned.

I noticed that if the Arduino is NOT powered but its joystick lead is plugged into the Atari, there are weird video glitches. I guess the unpowered Arduino drains current from the fire button line, enough to cause fluctuations in the entire system? Ignore any smoke, move along.

Adding support for more buttons would be as easy as plugging the additional wires into more Arduino pins. In the picture of the whole device above, only the red and yellow wires are in use; the blue/white ones aren’t connected to anything.

The code

All that’s left is to write some code so the device can be controlled with the LCD shield’s buttons.   Here’s what those buttons do:

  • Select – Show current light level reading
  • Left – Nothing
  • Up – Increase the light change time required to trigger
  • Down – Decrease the light change time required to trigger
  • Right – Start measuring. (This clicks the Atari’s fire button, then waits until the light has changed enough.)

Here is the code for the Arduino sketch. (I did all this in a messy few hours, don’t judge me)

Tip: I leave the button in the “pressed” state for 100 ms (I guess 33.333 ms would technically be enough for an Atari 2600, but whatever, doesn’t matter), and I look for a change of about 60 from the light sensor to count as “hey, this part of the screen has definitely changed, stop the timer, we’re there”.

You can’t see it, but I’m pulling the Reset button on the Atari 2600 between tests so it’s on the right screen for the Arduino to send the “fire button” when I start a measurement.  The game is Xenophobe.

Testing results:  Atari 2600 console on CRT

  • Sensor in the upper left of the CRT returns between 0 and 33 ms.
  • Sensor in the bottom left of the CRT returns between 16 and 33 ms.

This seems about what it should be, give or take 1 ms or so? It’s possible to get near 0 ms from button push to pixel display. I mean, I guess I knew it was, but it’s still fun to measure it.

So why did I use the Xenophobe cartridge? Because it’s just the first game I found that clearly changes a dark block of the screen to a lighter color instantly when a button is pressed. (I wouldn’t want to go light-to-dark due to possible ghosting issues.)

There are probably homebrew roms built to do this for most systems, but I couldn’t find one with a cursory googling, so here we go.

Testing results: Atari 2600 console with upscaler, various video switchers & old Dell LCD monitor

  • Sensor in the upper left of the panel returns between 79 and 130 ms

Ouch. Well, I knew it wasn’t going to be good; I can only imagine how bad it would be with a Pi or something in the chain as well. Anyway, I won’t get serious about testing (I’m no My Life In Gaming) or detail the exact hardware in my retro area (it’s weird…), I just want to be ready for when I need to compare latency on my setups down the road.

Conclusion & random thoughts

I’d like to test the full “controller to display” latency on my Raspberry Pi and main computer setups as well, but I think that means I’d have to hack into a 360 controller so the Arduino can control the button as we did with the Atari. Maybe later.

It would be wondrous to get to a place where we could once again write games counting on the C2S (controller to screen) lag being low, knowing everybody is getting the best experience.

You always want to know your server ping time when you play online, right?

Well, maybe we could start building in latency test systems so a console (using its camera? Or maybe all TVs/monitors should have a tiny light sensor in a corner that could be queried via HDMI) would know its own latency (both visual and auditory), and a game could use that data to automatically sync up stuff for rhythm games, adjust the timing of quick-time events, or even just display a “congratulations, you’ve got horrible latency” warning if needed.

Here’s a Chrome extension that scrapes Sony’s website to get a list of owned Playstation games

If you’ve ever wanted a list of your digitally purchased/downloaded Playstation games (I have over 500 <sigh>), I wrote this chrome extension that can do it, as a test for something else.

It’s crap because it will break if the Sony site changes at all, and it misses games that weren’t purchased digitally (disc games, for example), but hey, here it is.

Click here for Github source & install/usage instructions

The data it creates is in json format and looks like this:

"games": [
    {
      "title": "Judgment",
      "Size": "30.75GB",
      "PurchaseDate": "7/4/2019",
      "Platforms": [
        "PS4"
      ],
      "productID": "UP0177-CUSA13186_00-JUDGMENTRYUGAENG"
    },
    {
      "title": "Borderlands: The Handsome Collection",
      "Size": "28.64GB",
      "PurchaseDate": "7/4/2019",
      "Platforms": [
        "PS4"
      ],
      "productID": "UP1001-CUSA01401_00-BORDERLANDSHDCOL"
    }, ... and so on

If I cross-referenced it with trophy data it would be more accurate. If anyone knows a better way to get at this data (one that doesn’t break if the user isn’t using English, for example…), please let me know.
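
If you just want quick stats from the export, it’s easy to slice up. Here’s a minimal Python sketch (assuming you saved the output as games.json):

import json
from collections import Counter

with open("games.json", encoding="utf-8") as f:
    games = json.load(f)["games"]

print("{} games total".format(len(games)))

# Count purchases per platform
platform_counts = Counter(p for g in games for p in g["Platforms"])
for platform, count in platform_counts.most_common():
    print("  {}: {}".format(platform, count))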

No plans to add anything else to this, but I wanted to throw up a post about it so anybody else working on something similar could find the source if needed.