Java and Gradle Continuous Integration Builds Using Github Actions


There are a million CI solutions available to engineers these days, but one of the simplest to integrate with a simple GitHub project is the one built right into GitHub: Actions. Here's a quick process for setting up a Java project with Gradle to run your tests automatically on every commit, for every branch.

Just drop into your project a file named .github/workflows/continuous-integration-workflow.yml with the following contents:
name: Build
on: [push]

jobs:
  build:
    name: "David's Build"
    # This job runs on Linux
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: 'Set up JDK 1.8'
        uses: actions/setup-java@v1
        with:
          java-version: 1.8
      - name: 'gradlew build'
        run: cd ${GITHUB_WORKSPACE} && ./gradlew build
If you want to skip the standard Gradle configuration and figuring out which Gradle files need to be committed for this to work, have a gander at (or fork) my MIT-licensed repo here. You can see the passing build here.

Programmatically Clearing Caps Lock In Linux

I'm not particularly a fan of caps lock, and I typically re-map the caps lock key to do something more useful. Still, sometimes caps lock gets enabled accidentally, or because some application enables it to be "helpful". Here's a little Python script I keep around to disable caps lock from a terminal/shell, just in case.

#!/usr/bin/env python
from ctypes import *

# Talk to the X server directly through Xlib's keyboard extension (Xkb).
X11 = cdll.LoadLibrary("libX11.so.6")
display = X11.XOpenDisplay(None)
# 0x0100 is XkbUseCoreKbd (the core keyboard device); the affect mask 2 is
# LockMask (the caps lock modifier); the final 0 clears it.
X11.XkbLockModifiers(display, c_uint(0x0100), c_uint(2), c_uint(0))
X11.XCloseDisplay(display)
Source

Simple Sneakernet Backups

The easiest way to back up data is to synchronize it to a cloud. For this, I use Syncthing and various cloud services. But if I accidentally delete something important and that delete operation is synchronized to my cloud-based backups before I catch it, that data is lost. For true offline, delete-resistant backups, nothing beats the sneakernet. The cheapest way to back up a home computer is to purchase a USB hard drive, plug it in, and copy files. And you don't need any special software to make this work: any POSIX machine with tar and gpg can make an encrypted backup.
First, disable sleep on the device that will be backing up the data.
Next, run the following commands to back up your data:
SOURCE=/home/you
DESTINATION=/media/mounteddrive/backup-2020-02-05.tar.gz.gpg
tar czvpf - "$SOURCE" | gpg --symmetric --cipher-algo aes256 -o "$DESTINATION"
You'll be prompted for a passphrase to use for the symmetric AES encryption.

And, to restore:
SOURCE=/media/mounteddrive/backup-2020-02-05.tar.gz.gpg
DESTINATION=/home/you
(cd "$DESTINATION" && gpg -d "$SOURCE" | tar xzvf -)
What's great about this is that you are using ubiquitous free open source tools. You know that wherever or whenever you plan to restore this data, you'll be able to.
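If you want to verify a backup without restoring it, the same pipeline can list the archive's contents. Here's a self-contained round-trip sketch on scratch data; the "test123" passphrase is a throwaway for the demo, and the --batch/--pinentry-mode loopback flags (GnuPG 2.x) just let gpg take the passphrase non-interactively, where a real backup would let gpg prompt:

```shell
#!/bin/sh
# Round-trip sanity check of the tar | gpg pipeline on scratch data.
set -e
WORK=$(mktemp -d)
mkdir -p "$WORK/source"
echo "hello" > "$WORK/source/file.txt"

# Back up: tar the directory and encrypt the stream symmetrically.
tar czpf - -C "$WORK" source \
  | gpg --batch --pinentry-mode loopback --passphrase test123 \
        --symmetric --cipher-algo aes256 -o "$WORK/backup.tar.gz.gpg"

# Verify: decrypt and list the archive's contents without extracting.
LISTING=$(gpg --batch --pinentry-mode loopback --passphrase test123 \
              -d "$WORK/backup.tar.gz.gpg" 2>/dev/null | tar tzf -)
echo "$LISTING"
```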

Quick And Dirty URL Shortener On Any Site

First, I think this is a terrible idea and you should never do this. Second, I did this right here on this site. 

I wanted to add a URL shortener to my website so that when people go to davidron.com/something, they get redirected to some arbitrary location. I added the following to my 404 page:
<script language="javascript">
var key = window.location.href.split("/")[3];
var urls={
    'ssh':"http://sdf.org/ssh",
    'blog':"http://blog.davidron.com",
    'emacs':"http://ratamacu.freeshell.org/qe",
}

if(key){
    if(urls[key]){
        window.location.href=urls[key]
    }else{
        document.write("'"+key+"' not found :(");
    }
}
</script>
Now, I can go to davidron.com/ssh to open a terminal or davidron.com/emacs to download a qe binary.

This has several disadvantages:
  • It's a complete abuse of the 404 page, which shouldn't redirect anywhere.
  • It's javascript, and a URL shortener should really use HTTP 3xx redirects.
  • It's slow, because you have to load the 404 page, render (at least part of) it for the user, and then run the script on it.
  • Whatever security through obscurity there might be in a cryptic short URL is lost by the fact that I've published the entire database of URLs on my 404 page.
It also has a couple of advantages:
  • It's stupid easy.
  • It works.
  • It's easy to edit right in the HTML.

Backing Up Data In A Chrome Extension (Ears Audio Toolkit)



If you have a Chrome extension with some state you'd like to back up for posterity, this trick might work for you. In this case, I'm going to back up Ears Audio Toolkit, an amazing audio equalizer that allows you to save equalizer presets for every audio device you have.

  • Right-click the extension
  • Choose inspect
  • In the left panel, select Application->Local Storage->chrome-extension://....
From here down, the directions will depend on the extension. For Ears Audio Toolkit:
  • Find the PRESETS key
  • Double-click the JSON representing the value of PRESETS
  • Copy/paste the giant JSON blob somewhere safe
Ears Audio Toolkit offers a $1/month subscription to synchronize your data for you, but it doesn't really support capturing this data for your own personal backup, let alone versioning it in git. That said, I strongly support throwing $5 or more at the author to show your support, since this trick effectively bypasses the tool's business model. Also, since this is not really a supported feature, there's no guarantee the author won't change the format of the JSON blob and break this strategy.

Cox Data Usage Charges

Recently, Cox, a city-sanctioned monopoly, has begun charging users for data usage over 1TB in my town, Santa Barbara.  While I don't believe that charging users for data usage is inherently wrong, I do believe that there is a fair market value for data and that Cox is overcharging dramatically, in an environment where it holds a government-sanctioned monopoly.  This, I believe, is wrong.

First, let's break down how I see pricing for data connections working into two groups: fixed costs and variable costs.

Fixed Costs

The primary fixed costs for home network connections are the physical connection between the internet provider and the home and the other networking equipment required to make the connection.  This is similar to the electrical lines that connect your home to the grid and the grid infrastructure necessary to transmit electricity from a generator to a home.  As with the power grid, there is not really an increase in cost here when a customer consumes more, unless the customer has specialized requirements that require some sort of upgrade, which is extremely uncommon.

Variable Costs

Internet service providers have to connect their users to the rest of the Internet.  Fundamentally, a residential internet service provider has one or more agreements with other internet service providers to ensure that any computer on the internet can talk to any other computer on the internet.  Cox, for instance, likely has arrangements with companies such as Level3 and Cogent, who can connect Cox to other service providers such as Comcast and Time Warner.  Internet companies like Google and Facebook also have agreements with the same providers (Level3 and Cogent).  Unlike residential providers, these transit providers must compete for business, and the companies who connect to them can distribute traffic across several of them, balancing it throughout the day to bring costs down.  These costs continue to shrink every year.

A more detailed writeup of these fixed and variable costs can be found at this very good Broadband Now article.

So, what is wrong with what Cox is doing?

Cox's pricing is here.  I find this pricing suspicious given that Google Fiber is able to provide substantially better service at a lower cost in Kansas City.  But setting aside the fact that Cox charges more for less to all subscribers, let's look at what Cox charges people who use over 1TB of data to, say, restore their computers from an online backup service: $10/50GB, or roughly $0.20/GB.  These overage charges should reflect only changes in variable costs, not the fixed costs that are already covered by the over $70/month Cox charges customers after all fees.  Google, Microsoft, and Amazon charge between $0.087/GB and $0.12/GB, or roughly half of what Cox charges, and those providers are including their fixed costs in those prices.  Transferring 1GB of data in a month with Google/Amazon/Microsoft costs about a dime; transferring 1GB of data in a month with Cox costs over $70.  Let's assume that this discrepancy is because Cox has higher fixed costs.  That means those fixed costs are covered, and any additional charges due to increased variable costs should be somewhere near the market value for network transit fees.  Cox is charging over 10x those fees (more on this later).

That's only a dime.  What does it matter?

If a home user wants to restore a 3TB hard disk from a backup, Cox will charge that person an extra $15.  That's a lot of money just to restore a backup!  This pricing will also act as a mechanism to deter people from streaming video from Cox's competitors: YouTube, Netflix, and the rest.  Once you hit that cap, Cox will charge you $10 for every 16 hours of Netflix you watch in 4K.  Netflix charges $12/month for the ability to stream as much 4K video as you want, and that includes both the licensing fees for the movies and paying their internet provider (Cogent or Level3) to ship the movie over the network.  That's just a couple of movies a week before you're paying more money to Cox than you are to Netflix!

How much should Cox charge then?

Here's what Amazon, Microsoft, and Google charge:

These providers, including Cox, were likely paying less than $0.01/GB as of 2011, in a market where prices have been falling consistently for decades.  Without transparency from Cox, it's hard to say what their bandwidth acquisition costs are, and Cox should be allowed to turn a profit.  But I find anything more than $0.03/GB (billed in 1GB blocks, not 50GB blocks) highly suspicious, and even that reflects a 3x profit margin given the data I have been able to find.  Cox is currently charging between $0.20 and $10.00/GB depending on how much of each 50GB block is used.
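To make that per-GB spread concrete, here's the arithmetic behind the 50GB block billing, using the $10-per-50GB figure quoted above:

```shell
# Effective per-GB price of a $10, 50GB overage block:
awk 'BEGIN { printf "full block used: $%.2f/GB\n", 10 / 50 }'
awk 'BEGIN { printf "only 1GB used:   $%.2f/GB\n", 10 / 1  }'
```

Because Cox bills in whole 50GB increments, the less of a block you actually use, the more each gigabyte effectively costs.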

What should be done?

I would like the Santa Barbara city council to publicly work with Cox to correct this abuse of its monopoly privileges in our community.  My personal feedback is that the city should ensure that changes in policy by our sanctioned monopolies re-open negotiations between the monopoly and the city, so that the tax-paying consumer is treated fairly.  I would hope other jurisdictions do so as well.

Git: Resetting a remote branch to a specific hash without a force push

I had a series of commits (including merges) on a branch that I wanted to roll back quickly.  I wasn't able to find any help for this problem that didn't involve either giving git a bunch of help navigating trees with the git revert -m command, or using reset and a force push.  Here's a trick that's very similar to the reset strategy but retains all of the history:


> git reset --hard THE_HASH_YOU_WANT_TO_RETURN_TO
# That's our good commit
> git rebase -i origin/master
# During the rebase, I squashed all but the top commit to make it one giant commit.
# Gives us a single commit with all of the things that changed since the good commit.
# That commit was HASH_OF_ALL_CHANGES_SINCE_GOOD_COMMIT
> git revert HASH_OF_ALL_CHANGES_SINCE_GOOD_COMMIT
# That makes a negative commit of that one giant commit named REVERT_OF_ALL_CHANGES_SINCE_GOOD_COMMIT
> git reset --hard origin/master
# Back to reality
> git cherry-pick REVERT_OF_ALL_CHANGES_SINCE_GOOD_COMMIT
# applies a change that reverts all changes since THE_HASH_YOU_WANT_TO_RETURN_TO
After that, just push!
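The interactive rebase step is hard to script. For reference, the same end state (one new "negative" commit that restores the old tree, pushed without --force) can also be reached non-interactively; this is a sketch of an equivalent route of my own, not the exact steps above, demonstrated on a throwaway repo with a hard reset followed by a soft reset:

```shell
#!/bin/sh
set -e
# Build a throwaway repo: one good commit, then two bad ones.
WORK=$(mktemp -d)
cd "$WORK"
git init -q repo
cd repo
git config user.email demo@example.com
git config user.name Demo
echo v1 > file.txt; git add file.txt; git commit -qm "good"
GOOD=$(git rev-parse HEAD)
echo v2 > file.txt; git commit -qam "bad 1"
echo v3 > file.txt; git commit -qam "bad 2"

# Two resets produce one commit that undoes everything, keeping history:
git reset --hard "$GOOD"     # working tree and index now match the good commit
git reset --soft ORIG_HEAD   # HEAD back on the bad tip; index still holds the good tree
git commit -qm "revert everything since $GOOD"

cat file.txt                 # prints: v1
git rev-list --count HEAD    # prints: 4 (all history retained)
```

After the final commit, an ordinary push publishes the rollback.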