Author Archives: Seth

Universal Game Translator – Using Google’s Cloud Vision API to live-translate Japanese games played on original consoles (try it yourself!)

Why I wanted a “translate anything on the screen” button

I’m a retro gaming nut.  I love consuming books, blogs, and podcasts about gaming history.  The cherry on top is being able to experience the identical game, bit for bit, on original hardware.  It’s like time traveling to the 80s.

Living in Japan means it’s quite hard to get my hands on certain things (good luck finding a local Speccy or Apple IIe for sale) but easy and cheap to score retro Japanese games.

Yahoo Auction is kind of the eBay of Japan.  There are great deals around if you know how to search for ’em.  I get a kick out of going through random old games; I have boxes and boxes of them.  It’s a horrible hobby for someone living in a tiny apartment.

Example haul – I got everything in this picture for $25 US! Well, plus another $11 for shipping.

There is one obvious problem, however

It’s all in Japanese.  Despite living here over fifteen years, my Japanese reading skills are not great. (don’t judge me!)  I messed around with using Google Translate on my phone to help out, but that’s slow and annoying to use for games.

Why isn’t there a Google Translate for the PC?!

I tried a couple of utilities that might have worked for at least emulator content on the desktop, but they all had problems: font issues, weak OCR, and nothing was built to work on a source-agnostic HDMI signal so I could do live translation while playing on real game consoles.

So I wrote something to do the job called UGT (Universal Game Translator) – you can download it near the bottom of this post if you want to try it.

Here’s what it does:

  • Snaps a picture from the HDMI signal and sends it to Google to be analyzed for text in any language (see the sketch after this list)
  • Studies the layout and decides which text is dialog and which bits should be translated “line by line”
  • Overlays the frozen frame and translations over the gameplay HDMI signal
  • Allows copy/pasting the original language or looking up a kanji by clicking on it
  • Can translate any language to any language without needing any local data, as Google is doing all the work.  It can handle rendering Japanese, Chinese, Korean, etc.  (The font I used is this one)
  • Controlled by hotkeys (desktop mode) or a control pad (capture mode, this is where I’m playing on a real console but have a second PC controller to control the translation stuff)
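For the curious, the Google round trip is just a couple of HTTPS POSTs.  This is a minimal libcurl sketch of the Vision call, not UGT’s actual code – the Translate step is a similar POST to https://translation.googleapis.com/language/translate/v2:

#include <curl/curl.h>
#include <string>

static size_t OnWrite(char* data, size_t size, size_t n, void* out)
{
    ((std::string*)out)->append(data, size * n);
    return size * n;
}

std::string DetectText(const std::string& apiKey, const std::string& base64Png)
{
    // TEXT_DETECTION returns each recognized string with its bounding box,
    // which is what lets UGT overlay translations in the right places.
    const std::string body =
        "{\"requests\":[{\"image\":{\"content\":\"" + base64Png + "\"},"
        "\"features\":[{\"type\":\"TEXT_DETECTION\"}]}]}";

    std::string reply;
    CURL* curl = curl_easy_init();
    curl_slist* headers = curl_slist_append(nullptr, "Content-Type: application/json");
    curl_easy_setopt(curl, CURLOPT_URL,
        ("https://vision.googleapis.com/v1/images:annotate?key=" + apiKey).c_str());
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, OnWrite);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &reply);
    curl_easy_perform(curl);
    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    return reply; // JSON with textAnnotations: strings plus bounding polygons
}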

In the video above, you’ll notice some translated text is white and some is green.  Green text means UGT’s weighting system has decided that section is “dialog”.

If a section isn’t determined to be dialog, it’s translated “line by line”.  For example, options on a menu shouldn’t be translated all together (Run Attack Use Item) but as little pieces separately, like “Run”, “Attack”, “Use item”, each overlaid exactly over its original position.  If translated as dialog, a menu would look and read very badly.
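I won’t pretend this is the real weighting code, but conceptually the decision looks something like this simplified illustration:

#include <string>

// Illustrative only: long, wide blocks of text score as dialog;
// short isolated strings get the line-by-line treatment.
struct TextBlock { std::string text; int x, y, width, height; };

bool LooksLikeDialog(const TextBlock& b)
{
    int score = 0;
    if (b.text.size() > 20) score++;     // menu entries are usually short
    if (b.width > b.height * 4) score++; // dialog boxes are wide and flat
    return score >= 2;
}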

Here’s how my physical cables/boxes are set up for “camera mode”. (Not required, desktop mode doesn’t need any of this, but I’ll talk about that later)

Happy with how merging two video signals worked with a Roland V-02HD on the PlayStep project, I used a similar method here too.  I’m doing luma keying instead of chroma, as I can’t really avoid green here.  I modify the captured image slightly so its luma is high enough not to be treated as transparent in the overlay. (of course, the unmodified version is the one sent to Google)
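The luma tweak is just a per-pixel floor on brightness – something like this sketch (simplified; in the real app the floor comes from the minimum_brightness_for_lumakey setting in config.txt):

#include <algorithm>

// Raise any pixel's brightness above the keying threshold so the frozen
// frame never reads as transparent on the V-02HD.
void LiftLuma(unsigned char* rgb, int pixelCount, int minLuma)
{
    for (int i = 0; i < pixelCount; i++, rgb += 3)
    {
        int luma = (rgb[0] * 77 + rgb[1] * 150 + rgb[2] * 29) >> 8; // Rec. 601-ish
        if (luma < minLuma)
        {
            int boost = minLuma - luma;
            for (int c = 0; c < 3; c++)
                rgb[c] = (unsigned char)std::min(255, rgb[c] + boost);
        }
    }
}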

This setup uses the Windows camera interface to pull HDMI video (using ESCAPI by Jari Komppa) to create the screenshots it sends to Google.  I’m using an Elgato Cam Link for the HDMI input.
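Grabbing a frame through ESCAPI is refreshingly simple – roughly like this, with error checking omitted:

#include "escapi.h"

// Simplified sketch: grab one 1080p frame from capture device 0
// (the Cam Link, on my machine).
int* GrabFrame()
{
    if (setupESCAPI() < 1) return nullptr;    // loads escapi.dll, returns device count

    SimpleCapParams params;
    params.mWidth = 1920;
    params.mHeight = 1080;
    params.mTargetBuf = new int[1920 * 1080]; // 32-bit pixels land here

    initCapture(0, &params);
    doCapture(0);                             // request a single frame
    while (isCaptureDone(0) == 0) { }         // spin until it arrives
    deinitCapture(0);
    return params.mTargetBuf;                 // caller encodes this and sends it to Google
}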

Anyway, for 99.99999999% of people this setup is overkill, as they are probably just using an emulator on the same computer, so I threw in a “desktop mode” that lets you use hotkeys (default is Ctrl-F12) to translate the active window.  It’s just like having Google Translate on your PC.
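Under the hood, desktop mode is standard Win32 stuff – a global hotkey plus a grab of the foreground window.  A simplified sketch, not the actual UGT source:

#include <windows.h>

int main()
{
    RegisterHotKey(NULL, 1, MOD_CONTROL, VK_F12);  // Ctrl-F12, the default

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0))
    {
        if (msg.message == WM_HOTKEY)
        {
            HWND hwnd = GetForegroundWindow();     // the "active window"
            RECT r;
            GetWindowRect(hwnd, &r);
            // ...BitBlt that screen rect, then send it off for OCR/translation...
        }
    }
    return 0;
}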

Here’s desktop mode in action, translating a JRPG being played on a PC Engine/TurboGrafx-16 via emulation.  It also shows how you can copy/paste the recognized text if you want, useful for kanji study or getting the text read to you.  You can click a kanji in the game to look it up as well.  (Update: As of V0.60 it can handle text-to-speech internally, just click on the text.  Shift-click to alternate between the source/destination language)

Try it yourself

Before you download:

  • All machine translation is HORRIBLE – this in no way replaces the work of real translators, it’s just (slightly) better than nothing and can stop you from choosing “erase all data” instead of “continue game” or whatever
  • You need to rename config_template.txt to config.txt and edit it
  • Specifically, you need to enter your Google Vision API key.  This is a hassle, but it’s how Google stops people from abusing their service (see the example config after this list)
  • Also, you’ll need to enable the Translation API
  • Google charges money for using their services after you hit a certain limit. I’ve never actually had to pay anything, but be careful.
  • This is not polished software and should be considered experimental meant for computer savvy users
  • Privacy warning: Every time you translate, you’re sending the image to Google to analyze.  This also could mean a lot of bandwidth is used, depending on how many times you click the translate button.  Ctrl-F12 sends the active window only, Ctrl-F11 translates your entire desktop.
  • I got bad results with older consoles (NES, Sega Master System, SNES, Genesis), especially with games that use only hiragana and no kanji.  PC Engine, Saturn, Dreamcast, Neo Geo, PlayStation, etc. worked better, as they usually have sharper fonts with full kanji.
  • Some game fonts work better than others
  • The config.txt has a lot of options, each one is documented inside that file
  • I’m hopeful that the OCR and translations will improve on Google’s end over time, the nice thing about this setup is the app doesn’t need to be updated to take advantage of those improvements or even additional languages that are later supported
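To give you an idea of the format (the setting names here are made up for illustration – the real ones are documented inside config_template.txt itself):

google_api_key|AIzaSy-your-key-here
source_language|ja
target_language|en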

While a translation is displayed, you can hit ? to show additional options.  Also, the shot above is outdated; use the real app to see the latest.

5/8/2019 – V0.50 Beta – first public release, experimental
5/13/2019 – V0.51 Beta – Added S to screenshot, better error checking/reporting if translation API isn’t enabled for the Google API key, minor changes that should offer improved translations
5/30/2019 – V0.53 Beta – Added input_camera_device_id setting to config.txt for systems with multiple cameras.  Moves mouse offscreen for “camera” mode captures
9/5/2019 – V0.54 Beta – Fixes crash on startup problem some people had, adds “audio|none” config.txt command to optionally disable all sound.  Added “minimum_brightness_for_lumakey” setting to config.txt in case the default isn’t right
9/15/2019 – V0.60 Beta – New feature, text to speech!  You’ll need to enable Google’s Text-to-Speech API.  Fixed a crash bug, added some in-app persistent settings, and the gamepad can now move the cursor around and click things.  Controls changed a bit.  Added automatic reading of detected dialog, you can choose to read the source or destination language, and you can hide text overlays now if you want.  A few new options in the config.txt.  Switched to FMOD audio; SDL_Mixer has buggy mp3 playback which was causing me some grief.  Changed the translate-button sound to something more soothing.

Note: I plan to open source this, I just need to get around to putting it on GitHub.  If you’re someone who would actually do something with the source, please hassle me into doing it.

Download Universal Game Translator for Windows (64-bit) (Binary code signed by Robinson Technologies)

Conclusion and the future

Some possible upgrades:

  • Built-in text to speech on the original dialog (well, by built in I mean using Google’s text to speech API and playing it in UGT, easier than the copy and paste method possible now)
  • A built-in kanji lookup might also be nice; Jim Breen’s dictionary data could work for this.
  • My first tests used Tesseract to do the OCR locally, but without additional dataset training it appeared to not work so hot out of the box compared to results from Google’s Cloud Vision.  (They use a modified Tesseract?  Not sure)  It might be a nice option for those who want to cut down on bandwidth usage or reliance on Google.  Although the translations themselves would still be an issue…

I like the idea of old untranslated games being playable in any language, in fact, I went looking for famous non-Japanese games that have never had an English translation and really had a hard time finding any, especially on console.  If anyone knows of any I could test with, please let me know.

Also, even though my needs focus on Japanese->English, keep in mind this also works to translate English (or 36 other languages that Google supports OCR with) to over 100 target languages.

Test showing English being translated to many other languages in an awesome game called Growtopia

Spawning annoying black holes in Fortnite to force a kid to exercise

Sure, there are ways to get exercise while gaming. Virtual reality and music games like Dance Dance Revolution come to mind.

But that’s all worthless when your kid just wants to play Fortnite.

Behold, the PlayStep!

This thing forces him to work up a sweat. This post details what methods I used and issues I had making it.  (Github source code for the program that runs on the Pi here for anybody who wants to make one)

Building a screen blanker connected to exercise isn’t a new idea (see the end of this post for related links I found) but my version does have some novel features:

  • Dynamically modifies the video and audio of the game’s HDMI signal to do things like partially obscure the screen in random ways
  • Uses an energy bank so you can save up game time.  This means you can madly pedal in the lobby and still sit in a chair during the critical parts of Fortnite

I first built a cheap version (~$120 in parts).  It just blanks the screen when you’re out of energy, and uses an LCD screen to show energy left.

I then did a better but more expensive way (~$700 in parts) but it’s a lot cooler.

The expensive version with HDMI in/out, the “enclosure” is a plastic basket thing from the dollar store

Things both ways have in common:

  • Use a Raspberry Pi 3B+ (a $40 computer with hardware GLES acceleration) with the RetroPie distro – I start with it because its mouse/keyboard/GLES/SDL works out of the box with Proton SDK, where normal Raspbian requires tweaking/compiling some things
  • Use Proton SDK for the app base (allows me to design/test on Windows; it handles abstraction for many platforms so I can write once but run everywhere)
  • Use hall effect sensors to detect the pedal-down position on each pedal via the Pi’s GPIO (see the sketch after this list).  This way a kid can’t cheat; he’s forced to move the full range of the stepper
  • The sensors are placed on a stepper exerciser.  I used a USB connector for the wiring so I could unplug/replace it later if I wanted to set up a different exercise machine, like if I ever got a stationary bike.
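The GPIO side is only a few lines with WiringPi.  Roughly like this – the pin numbers here are just examples, mine are set in the source on GitHub:

#include <wiringPi.h>

const int LEFT_PEDAL_PIN = 17, RIGHT_PEDAL_PIN = 27; // BCM numbering, example pins

void SetupSensors()
{
    wiringPiSetupGpio();            // use BCM GPIO numbering
    pinMode(LEFT_PEDAL_PIN, INPUT);
    pinMode(RIGHT_PEDAL_PIN, INPUT);
    pullUpDnControl(LEFT_PEDAL_PIN, PUD_UP);
    pullUpDnControl(RIGHT_PEDAL_PIN, PUD_UP);
}

// A typical hall effect module pulls its output low when the magnet is close,
// so a LOW read means that pedal is at the bottom of its travel.
bool IsPedalDown(int pin) { return digitalRead(pin) == LOW; }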

Yes, I’m about to duct tape an electrical taped sensor to a pencil that has been zip-tied in place. What? I never said I was pro

A note on using USB cables for wires and my idiocy

Each hall effect sensor requires three wires.  We have two sensors.  So we need to run six wires from the Pi GPIO pins?  WRONG! We only need four because the power and ground can be shared between them.

So I thought hey, I’ll use USB cables and connectors lying around, as they have four wires in them. (until we get to USB 3+ cables, but ignore that)

Then I thought, if I could find a simple USB Y splitter, it would be easier to share the power/ground between the two sensors. (I’m not actually using this as a USB connection, it’s just so I can use the wire and handy plugs)

Wow, I found this for cheap on Amazon:

Perfect!  A lowly USB splitter that I’m sure just has no fancy electronics hidden inside

So I partially wired it up but when testing found that the middle pins had no continuity.  Can you guess why?

WHAT THE HELL IS THIS INSIDE THE CABLE?!

It’s got a hub or something hidden in the connector.  I never plugged it into an actual PC or I might have noticed.  No wonder it didn’t work.  I removed the electronics part (it was a horror, I shouldn’t be allowed near soldering irons) and it worked as expected. Moral of the story is, I’m dumb, and don’t trust USB splitters to just split the wires.

The cheap way (just screen blanking with LCD panel)

My “cheap” way ignores rendering anything graphical (it doesn’t output any HDMI itself) and just shows a single “energy count” number on an LCD screen.  When the count runs out, the game’s HDMI signal is completely shut off until it goes positive again.  In the video above I’m using little buttons to test with instead of the stepper.

To help the user notice the screen is about to shut off it makes a beeping noise as the counter nears zero.

I suggest never testing this at an airport, can’t stress that enough really.

So how can a Raspberry Pi turn on/off the game’s HDMI signal?

A splitter with no USB power = a dead signal

This is hacky but it works – I took an old 1X2 HDMI splitter and powered it from one of the Pi’s USB ports.  (lots of electronics these days use a USB plug for power)

I only use one of the outputs on the splitter as I don’t really need any splitting done.

It’s possible to kill the power on a specific Pi USB port using a utility called uhubctl.

So when the player is out of “energy”, I kill the USB port powering the HDMI splitter by having my C++ code run a system command of:

./uhubctl -a off -p 2

And because the HDMI splitter is now unpowered, the signal dies, killing the game screen.

After turning the USB port back on (replacing “off” with “on”) it will power up and start processing the HDMI signal again.  Originally I was using the Pi to turn on/off an entire AC outlet but that seemed like overkill – I was thinking maybe turning off an entire TV or something, but meh.
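In C++ it’s about as dumb-simple as it sounds (paraphrased):

#include <cstdlib>

// Cut or restore VBUS power to the hub port feeding the HDMI splitter.
// Port 2 matches my wiring; run uhubctl with no args to list your ports.
void SetGameScreenPower(bool on)
{
    std::system(on ? "./uhubctl -a on -p 2" : "./uhubctl -a off -p 2");
}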

So the big downside of this method is that it takes up to 5 seconds for the HDMI splitter to turn back on and your TV to recognize the signal again.  It works, but… not optimal.  Also, in my case I don’t really have a good place to put the LCD screen or speaker for the beeping. (might make more sense on a stationary bike instead of a stepper)

Alternate way to disable the HDMI signal: instead of this no-wiring hack, maybe run it through an HDMI cable but put one of the pins through a relay to turn that pin on/off?  Might be the same effect but cheaper and simpler… although, which pin?!

The expensive but better way (offers more options with images and audio)

There isn’t enough drama in simply turning the HDMI signal on/off – wouldn’t it be better if holes started spawning randomly over your actual gameplay and you had to pedal to remove them as your screen became increasingly obscured?!  There are a million options, really.

The Raspberry Pi can generate the graphics (thanks, GLES) and audio, but we need a way to overlay its HDMI output over the game’s HDMI signal with no noticeable latency cost at 60 fps.

This is known as a chroma key effect.  (Side note: I once bought a $5,000 video mixer in the 90s so I could do live effects like this, a WJ-MX50.  Just saw one on eBay for $100, damn it’s big)

The V-02HD. A lot cheaper than $5,000.

The cheapest stand-alone way I found to do it these days is a Roland V-02HD video switcher. (I bought it for $664 USD from Amazon Japan)

Does anybody know a better/cheaper alternative?  If I could figure out a zero-latency way to overlay with an alpha channel instead of just chroma, that would really be ideal.

It’s pricey, but it works perfectly.  It has the following features of interest:

  • Remembers all settings when powered on, including chroma key mode and color/sensitivity
  • Can disable auto-detection so inputs 1 and 2 are always the same even if input is turned off
  • Can disable all buttons/levers on it so accidental changes won’t happen (we don’t need them active, it’s just a black box to us)
  • It’s pretty small for a video switcher
  • Mixes audio into the HDMI signal from both inputs
  • No noticeable latency

Although I didn’t need or use it, it’s worth noting that it can show up as a USB MIDI device and be controlled via MIDI signals.  That’s pretty cool – assuming the Pi could talk to it, you could do transitions between inputs or enable/disable effects.

The Software

With no color keying, this is what the raw Pi video out looks like

The software to control things uses Proton SDK with its SDL2 backend and WiringPi for the GPIO to read from the sensors.  It’s modified from the RTBareBones example.

It uses a config.txt file to adjust a couple things:

max_energy|600
energy_timer|1000
energy_per_move|7
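Those three settings drive the energy bank, which boils down to something like this (a simplified sketch – the full logic is in the GitHub source):

#include <algorithm>

// Every energy_timer milliseconds one unit drains; each full pedal press
// adds energy_per_move, capped at max_energy.
struct EnergyBank
{
    int energy = 0;
    int maxEnergy = 600;   // max_energy
    int energyPerMove = 7; // energy_per_move

    void OnTimerTick()  { energy = std::max(0, energy - 1); }   // fires every energy_timer ms
    void OnPedalPress() { energy = std::min(maxEnergy, energy + energyPerMove); }
    bool ScreenShouldBeOn() const { return energy > 0; }
};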

PlayStep Source code on github

Here’s some info on how to compile Proton examples.

To allow the Pi to correctly output 1080p HDMI even if the switcher hasn’t booted up yet, I edited /boot/config.txt and set:

hdmi_force_hotplug=1
hdmi_drive=2

To remove the unnecessary border, I also set:

disable_overscan=1

Final thoughts

Might be fun to simply design Pi-powered pedal games that use the stepper as a controller.  You could then output straight to a TV or TFT screen without worrying about the spendy chroma-keying solution.

I mean, sure, my kid would refuse to play it, but it could be a funny thing to show at a meet-up or something.

Related things to check out

  • Cycflix: Exercise Powered Entertainment – Uses a laptop to pause Netflix if you don’t pedal fast enough.  He connected an Arduino directly to the existing stationary bike electronics to measure pedaling, smart.
  • No TV unless you exercise! – Arduino mounted on a stationary bike cuts RCA signal via a relay if you don’t pedal enough.  Uses a black/white detector for movement rather than hall effect sensors.
  • TV Pedaler – A commercial product that blanks the screen if you don’t pedal enough – still being sold?  The website and product seem really old (no HDMI support) but they accept PayPal, and the creator posted here a few years ago about his 1999 patent and warned about “copying”.  Hrmph.  His patent covers a bunch of random ideas that his machine doesn’t use at all.  Patents like this are dumb; good thing it says “Application status is Expired – Fee Related”, I guess.
  • The EnterTRAINER – This defunct commercial device is basically a TV remote control with a heart monitor you strap to your chest.  Controls volume and TV power if your heart rate goes too low. Its hilarious infomercial was posted in one of the reviews.
  • The 123GoTV KidExerciser – Ancient commercial product that lets you use your own bike in the house to blank the TV if not pedalled fast enough.  Company seems gone now.


“Let’s war!” a free gamelet I wrote over the weekend for Ludum Dare 43

I made this little free game for the Ludum Dare game jam over the weekend.  A theme is given at the start; this time it was “Sacrifices must be made”.

Play it in your browser here

Unity source code download

Its Ludum Dare entry page

Want to see more LD games? Try the Ludum Dare Archive YouTube channel

Audio credits:

“Joshua Tree Windstorm”, “Summer Of Synth” by Chris Huelsbeck, used under a royalty-free license – https://www.patreon.com/chris_huelsbeck for more info

Do you use white noise to sleep or concentrate? Try my new iPhone app: Misophonia Sleep Kit

“I can hear you breathing… sorry kid, but you’ve got to stop or leave the room” is something Cosmo has heard many a time while using the computer next to me.

As my long suffering family can attest, I’ve had a “thing with sound” for a long time.  It’s one of the reasons I love my job – I’ve worked alone in a room for most of the last thirty years.

Introducing… my white noise app!  I wrote it because I couldn’t find another app that would automatically “mix” a small amount of noise into the podcasts I like, but switch to a different setting when it detects no other audio is playing, so I can sleep through the night.

It’s easy to use, has the perfect custom mixable sounds, and is totally free.  If you use white noise at all, check it out.

Misophonia Sleep Kit on the App Store (requires an iPhone, iPod touch, or iPad running iOS 10+)


Postmortem: IGF 2003 finalist Teenage Lawnmower – now free, open sourced

Originally sold for $19.95, our old game Teenage Lawnmower (released in 2002) is now a free download.  It requires Windows XP or newer to play.  Tested fine on Windows 10.

I’ve released a new  V1.17 “final freeware” release with the following changes:

  • Tweaked the optional gamepad controls to work with the Xbox 360 pad (at TLM’s launch, the 360 pad didn’t exist yet)
  • Payment for mowing jobs is now slightly randomized.  It was an easy change, I just had to edit some .c scripts; it makes the game slightly easier as well.
  • The text versions of the .c script files are now included uncompressed/unencrypted.  You can edit a .c file in any text editor and the game will instantly be changed – its scripting system doesn’t require compiling or anything.
  • Signed both the .exe and installer with the RTsoft Windows certs (it’s no longer “untrusted”), so they properly show Robinson Technologies as the maker of the installer

Download Teenage Lawnmower V1.17 (full version) for Windows here. (14 MB)

For posterity, the source code/etc. is now on GitHub.  Anybody is free to steal its 3D engine or its C-style scripting engine, or try to port the game to something else.

Warning & disclaimer: It’s 15 years old, I don’t recommend learning from or re-using this code. I mean, it’s better than Dink Smallwood’s code, but I wouldn’t call it elegant or anything.

The Teenage Lawnmower postmortem

Now, while I’m here, I thought it would be fun to talk about the experience Akiko and I had developing it, how many copies it sold, stuff like that.  For reference, I’ve kept the original TLM website frozen in time here.

“Get ready to smoke some weeds” the website says

My dream was to write a 3D RPG.

In those days, that meant writing your own 3D engine – there was nothing off the shelf with the power I’d need that you could just start with, like there is today.

Having been through the gauntlet of completing something as massive as a role playing game previously (Dink Smallwood) and barely surviving, I knew that the key would be to develop the engine using smaller sub-goals.

Teenage Lawnmower (TLM) was a game design that didn’t require a single human 3D model and was very flexible for how many assets/levels we wanted to create.  It could test our 3D engine features like scripting, terrain, and weather, and could even be sold for some extra bucks. (in theory…)

Akiko uses Poser to create a redneck for Teenage Lawnmower.  Poser was kind of the Unity asset store of the day.  (photo by Seth Robinson, 2002)

TLM was the stepping stone to something greater.  Spoiler alert, looking back, it was the only game I sold that was completed using my 3D engine, the RPG never happened.

So without this “stepping stone” I probably would have nothing at all to show for all that work on the engine.

Working for years on engines and ending up finishing nothing is very common in my circles; big games are REALLY hard to finish.

How long did it take to write?

It took around four months from start to finish.

I’d already been working on the 3D engine for a while before we started TLM, though; if I included that, it would be much longer.

The order/key/DRM system

Ordering via RegNow (and later PayPal) would automatically generate and send a custom text key to the buyer within a minute. (sort of like a Steam key)

It worked like this:

  • They run the full version, it asks for an unlock key
  • After entering it, it would connect to the server running from my house
  • The server would generate a final unlock key based on a unique machine ID the client sends along with the purchased key (a checksum of the user’s hard drive serial, if I recall correctly) – see the sketch after this list
  • The server would refuse to generate it if the max # of unlocks were reached (5 per year?)
  • There was a utility that could be run to register games on computers that didn’t have an internet connection, but it was a hassle for all involved
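The actual code is long gone from memory, but the flavor was something like this (illustrative only – the djb2 hash here just stands in for whatever checksum I really used):

#include <cstdio>
#include <string>

// Derive a short unlock code the server can reproduce from the purchased
// key plus a machine fingerprint.
unsigned int Checksum(const std::string& s)
{
    unsigned int h = 5381;
    for (unsigned char c : s) h = h * 33 + c; // djb2-style hash
    return h;
}

std::string MakeUnlockKey(const std::string& orderKey, const std::string& driveSerial)
{
    char buf[16];
    std::snprintf(buf, sizeof(buf), "%08X", Checksum(orderKey + driveSerial));
    return buf;
}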

I went overboard with the DRM.  I think I was overly concerned because previously Legend Of The Red Dragon had been pirated at a huge scale, and our RPG Dink Smallwood had hit the warez sites the same day as its release.  It hurts to know someone who pre-ordered the game probably only did so to pirate and share it.

TLM was never pirated as far as I know.

I later used this same anti-piracy system for Toolfish and Dungeon Scroll.  After a few years, I removed the DRM from all of my software.  Did I ever really need it?

Sales figures

I still have all the sales data so here it is. We started selling it using a payment processor called RegNow.  The first sale happened on August 28th, 2002.

Sales at $14.95 to $19.95 via RegNow: 399
Sales for less ($5 to $15): 58
Sales net revenue: Around $6,500

So that breaks down to $1600 a month (ignoring support and just looking at the four months of development) for two people.

How could we survive on only that revenue?  Well, we didn’t…

  • I spent most of the year before doing programming contract work for clients
  • We were living with Akiko’s parents as we saved for a house
  • Akiko was teaching English part time at a Yamaha English school

The Independent Games Festival

A friend clued me into the IGF and told me I should enter TLM. (Thanks Geoff Howland!)  The IGF entry deadline pushed me to finish the game quicker; I sent off the entry submission CDs the day after the game was done.  (yeah, in those days you had to send 10 CDs)

We were energized by being chosen as a finalist, despite the $2000+ cost for the two of us to attend the conference from Japan.  I mean really, that ate up a third of our earnings from the game!

I didn’t learn, though – I went back the next year when Dungeon Scroll was a finalist as well.  We joked that we just couldn’t afford to enter the damn IGF anymore.

A frazzled Seth shows off his game at the IGF (photo by Akiko, 2003)

My cheapo laptop could only run the game at 20 FPS or so.  Hosting a booth takes a lot of energy, the ability to talk up your game, and an outgoing personality.  Yeah, I’m zero for three.  I’ll be happy to never run a booth again!

Akiko sits at the finalist table during the awards show. She’s 5 months pregnant with Cosmo. (photo by Seth Robinson, 2003)

Was it worth spending that much to go even though we didn’t win jack?  I don’t know, it was a good experience or whatever.  It was my first time at the Game Developers Conference.

Being a finalist didn’t seem to generate many sales or website hits, not enough to notice anyway.  It did score me a few interviews; I found this one with Gamedev.net.

Dekochan

Hard to see in this pic, but those are beagles down there!

The beagle was modeled and textured after our real-life beagle, Dekochan.  Those are really his bark sound effects in there, too.

He passed away earlier this year (2018), but I’m glad he got to be immortalized in this and other RTsoft games first. (Photo by Akiko)

Dekochan’s 3D model also made an appearance in our iOS game Tanked.

The connection to Tarzan

How would Tarzan put out forest fires? Actually the less said about this game the better

So what about this incredible RPG engine that was sure to be an investment in RTsoft’s future? Well, before TLM we did use it in another “stepping stone project”: a game called Tarzan: Guardian Of Earth for a gamedev.net contest.

Lightning that can set trees on fire and some other effects were re-used in TLM.

Tarzan won first place in the contest, which is why I politely shamed the gamedev.net guys into coughing up the prize in the interview I linked earlier.  I was kind of bummed about the delay because I really could have used that graphics card.

They ended up buying the prize at a Toys R Us and giving it to me in person at the IGF event, nearly a year after I’d won it.  Unfortunately, graphics cards tend to age quickly…  (No, I’m not bitter at all, why do you ask?)

Fifteen years later, is Teenage Lawnmower any good?

The mowing itself isn’t quite fun or interesting enough to avoid feeling a bit grindy and boring after a while.  The mowing mechanic is kind of like Splatoon: you need to cover the ground (mowing is a little like painting), and there is a bomb thing that can mow a large area.

There are 17 “lawns” (think levels) that customers will ask you to mow.  You can say no to lawns you don’t like and other customers will ask, but you’ll lose precious time in the day.

The idea is that every lawn has a unique theme tied to a story surrounding the customer.  Sometimes that story even ties into the main story, like when your mom dates one of your freakier customers, who lives in the woods with dogs.

In the movie at the top of this article, you see a bunch of lemonade pitchers on the lawn.  The owner is an older woman trying to seduce you; she’s drugged the lemonade.  If you touch one with your mower, your controls get reversed, making it much harder.  If you touch three, well, you pass out and unspeakable things happen.

The second part of the video shows mowing the lawn of a guy with a mole problem.  If you kill a mole, that’s bad, but if you hit an empty hole, it gets removed and generates bonus dollars.

Later levels are more liberal with power up placements

A cemetery has skeletons reaching up through the dirt, a rich guy has money randomly spawning around his mansion, a golf course has ducks (yup, the one from Dink Smallwood) and a lady with only a tiny grass patch in the city presents a moral quandary concerning stealing from her wallet.

One of the crazier lawns in the game

The metagame to tie together the arcade sequences is earning money to meet various demands at home, which is the “story” portion.  The scripting system made it easy to quickly write and test dialog.  The story deals with topics like alcoholism and domestic abuse.

I was trying to get the player to care about what happens and push through to the next story dialog.  Choices you make can matter; for instance, your stepdad Todd will treat you differently depending on whether you took his steak from the fridge the day before.

On day thirty you’re treated to one of three story endings based on which difficulty level you’re playing.

A note on scripting

I’ve un-protected the scripting files for this public release, so if you navigate to the script or scriptg directory, you can see how a game like this is set up – it’s 99% scripts that can be edited on the fly.

Look at a few and make changes, you could make your own story or cheat!  The directory script/event holds the story for each morning/night, and script/levels holds the level logic for each lawn.

I think .pss files are particle systems, .wet is weather, .zon is the level and its terrain, and .obj describes the objects in the level.  Maybe.  The .X files are the models – remember .x files?!  The .dds files are textures.

Conclusion

Give the game a spin here if you dare. (14 MB)

So that’s how it was being an indie dev in the early 2000s.  Cya!