Automator

@gayleafcrime Very true, that’s a dimension I definitely overlooked in the OP. If anything, it would mean the ruling class granting fewer concessions and a more assertive defense of the capitalist order – dangerous possibilities. I foresee conservatives and right-populists taking that route, while liberals will opt for the “more superfluous jobs” idea and social democrats will opt for a universal basic income – all different tactics to maintain the same organizational structures. The basic premise of means-of-production ownership needs to be constantly questioned and assessed and pulled into the mainstream discourse. Dispossession can lead to fascism when influenced by the hard-right bourgeois elements, absolutely, but increasing left-populism is the flipside of that coin; if populists want an “us vs them” narrative in the rising tensions of late capitalist automation, let it be the people vs the ruling class, let it be the people vs the elites that dispossessed them.

It’s a complicated issue, and all I can recommend is doubling down on building class consciousness and pulling the ideas leftward. Help others peel back the facade and see the oppressive inner workings.

Open a File In An Instanced Program on OS X

One of the misconceptions about OS X is that it can’t run the same application with more than one instance. The way around this is to open the Terminal and write “open -n /Applications/yourapp.app”, which launches a new instance of the app even if it’s already open. I made an Automator OS X Service to open selected video files in an instance of VLC, and another for Maya 2009:
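The screenshots for those Services aren’t reproduced here, but the guts are just a single Run Shell Script action. Here’s a minimal sketch of one way to build the VLC version, assuming the Service receives movie files in the Finder and the action passes its input as arguments:

# Run Shell Script action body (assumed settings: shell /bin/bash, pass input as arguments)
# -n launches a new instance of VLC even if one is already running
open -n -a /Applications/VLC.app "$@"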

I find this helpful to check working assets and stock for building scenes in another instance of Maya. Anyway, grab the Automator actions here: For VLC and for Maya 2009. To install, unzip and put in ~/Library/Services/

If you want to change the application to use, just open the Service file in Automator and change the path to the app.

If you want to open a Maya 2011 instance with no input file, here’s a Service to do that. Update it for 2012 and on by changing the path text in Automator.

How to make & sell ebooks of your comics in like 12 1/2 seconds

even a dang baby can do it!

So you’re wondering if you should make ebooks of your comics. You should, but before I explain why, let me explain how simple it is to make an ebook of your comics. You can make one in like 2 ½ seconds. *NOTE: This tutorial is Mac-centric, sorry.

How do I make a PDF in like 2 ½ seconds?

Making a PDF on a Mac takes two steps. And you know what? I’m gonna cut one of those steps out for you. Download this app I made.

Literally just drag any group of image files onto that icon and it makes a pdf out of them. The pages will be in sequential order. So if you want to change the order of the pages you can just change the filenames. You can even batch rename them.
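(And if you’d rather not download anything at all and you happen to have ImageMagick installed, which this tutorial doesn’t assume, a rough Terminal equivalent looks like this; the filenames are just an example.)

# ImageMagick: combine the pages, in filename order, into a single PDF
convert page*.jpg mycomic.pdf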

You can even make your own app if you want; it takes the same amount of time as that PDF took to make. Just open up the Mac’s built-in program Automator and then…

There ya go, that’s easy. Making a CBZ is just as easy.

Okay, how do I make a CBZ in like 2 ½ seconds?

CBZ is another comic-reading format that some people like. I dunno what the big draw is, but power users like ‘em. Here’s how you make ‘em:

A CBZ is literally just a zip file with all your images in it.

Two things to keep in mind:

First, zero padding. Once again, your pages will appear in sequential order, but readers sort the filenames as plain text, so instead of 1, 2, 3 you’re gonna want to name your files 001, 002, 003, etc. Once again, you can rename them in bulk.

Secondly, Mac problems: for some reason, ZIP files made with OS X’s built-in archiver are sometimes wonky and won’t work for some people. Many have recommended zipping up your files with the confusingly-named WinZip Mac.
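If the built-in archiver does act up, one workaround is to zip the pages from the Terminal, which skips the hidden Mac metadata that seems to cause the trouble. A quick sketch, assuming your zero-padded pages are sitting in one folder (the names here are made up):

cd /path/to/your/pages
# -X leaves out the extra Mac file attributes that trip up some readers
zip -X mycomic.cbz *.jpg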

But if you want to class up those comics:

That’s how you make a file, but you’re gonna want to make your book look good! You could hypothetically just use the same web-sized images you output for your webcomic, but make it pretty! Make sure every page is the same size! Leave a little margin around the panels, unless you’re going for full bleed. Add in a nice cover, maybe a title page, a contact info page in the back.

Your workflow will depend on your comics, but remember that anything you have to do the same way a whole bunch of times (resizing, cropping, placing, rotating, etc) you can record and replay a Photoshop action.

Export each of those suckers as a .jpg. You can use “save for web” in Photoshop to keep the file size down.*

There’s no industry standard resolution, but I tend to just Google “newest iPad resolution” and make mine the same resolution as the newest hottest model. Right now, that’s 2048 pixels on the largest side.

*NOTE: When you use Photoshop actions to resize in bulk, “image size” remembers the actual measurements you give it, whereas “save for web” just remembers the percentage. So use image size first to get the size right, then save for web to get the file size down, or else you’ll end up with mismatched page sizes.
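(If you don’t live in Photoshop, the bulk resize can also be done with the built-in sips tool. This is just a rough sketch, assuming your exported JPEGs are all in one folder:)

# resample copies so the longest side is at most 2048 px, written into a "resized" folder
mkdir -p resized
sips -Z 2048 *.jpg --out resized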

So what do I do with these files?

Sell those suckers! You can have ebooks for sale in like 10 seconds.

You’ll get a link you can share anywhere, and they handle everything: taking payments, delivery, customer support, etc. Money just pops up in your account!

Why should you make ebooks available?

So your comic is free online. Why would anyone want to buy a pdf instead of just reading it free? What if you make it and nobody buys it? First of all, waaah. That took like 12 ½ seconds, total. If even one reader wants an ebook of your work, it’s worth the time investment. And let me tell you something: I buy PDFs all the time. I buy PDFs of comics available in their entirety free online all the time.

Here are some of the reasons I buy ebook editions of webcomics…

I need to catch up.

When I’m taking a break from work and find an awesome new comic with a massive backlog, I don’t have time to stop what I’m doing and read 1000 pages so that I understand the newest page. But if I can download a book that will catch me up and read it at my convenience, then I can pop that sucker in an RSS feed and follow along with everybody else!

It’s a better reading experience.

Look, your website is awesome for reading a page a day. But have you ever tried to read your own comic from the beginning? Scroll down, read, scroll down, find button, click, wait to load, scroll down, read, repeat. Times a million, it gets really annoying. It’s like watching a YouTube video that’s still buffering.

I read while traveling.

Dude, I spent huge chunks of my life on trains, planes, boats, in remote villages… and that’s where I read all my comics. I don’t got no internet! I load up on ebooks every time I travel!

I know what I’m getting into.

I like seeing a book and knowing “Ah, a 200 page story. Cool, I can get into that.” I know that if I buy that book, I can get a nice, curated chunk of story in one sitting and not have to worry about an impenetrable archive or starting to read something that’s never going to be finished, or is going to go on hiatus, or be interrupted by filler and apologies for late pages. I will impulse-buy an ebook much faster than I will impulse click “go to first page” on a webcomic.

I like to support artists.

I have bought digital versions of a comic that I’ve already read, and never bothered to even download them. Because I already read it for free, and I think it’s worth money! Just sending a donation feels creepy, and I don’t want to buy physical merch because getting crap shipped to Korea means I’m giving money to the post office instead of the artist.

So give it a shot!

I recommend letting readers name their price, especially if your stuff is already free online. Everyone has different reasons for buying. Whether they wanna collect ‘em all, get caught up, show a comic they like to a friend, or just sneakily donate to their favorite artist, they will be honest about what it’s worth to them. Lots of people will pay the minimum, but the generous people will more than make up for it. And people will be more apt to share and promote your comics to others if they know those people can name their price. But whatever! They’re your books, I don’t wanna get all up in your junk.

CALLING ALL SINGERS / SONGWRITERS / VOCALISTS / LYRICISTS!

The REGULARITY #61 (02/29/12)

Be a part of this track produced by the legendary DAN THE AUTOMATOR!

As you may or may not know - we are currently working with one of my favorite musicians/producers in music, Dan The Automator!

He took 30 to 40 pieces of hitRECord audio and made them into a seriously fantastic beat.  Now that beat needs to be made into a song - it needs vocals, your vocals!

So, who wants to jump on a track with Dan The Automator?

Plus - hitRECord is expanding so that our production company can meet the growth that’s knocking at our door — so, we hired a shiny new Team Member! Please welcome Mr. Matt Conley — hitRECord’s first Community Director! :oD

Thanks Again!

<3

J

==

Contribute to the collaboration here!

Bash script to convert eps files to png format

Just a quickie post, as I’d been looking for this trick for a little while before solving it myself. Here, I’m programming an app for toddlers, and the material is extracted from a book (more about that at a later date). So the book’s publisher gave us the source files, which were, of course, Encapsulated PostScript files in the 10-20 MB range (man, that fills a Dropbox quickly). That’s completely unusable, and I needed them in PNG.

Now, there are a lot of “Tools” available on the web to do that kind of conversion, and if you’re lucky enough to own a Mac, the Preview app makes converting from EPS to PNG a breeze. Just open the file, save as PNG and boom, you’re done. Unfortunately, I’ve got 56 files and I’m too lazy to run them one by one, particularly as there is a scaling operation involved afterwards. In comes ImageMagick, which is a pretty handy tool. If you haven’t done so, install the thing and come back here in a minute. Next, open the Terminal; the bash code is:

> cd myepsimagefolder
> for i in $(find . -maxdepth 1 -type f -iname "*.eps");
> do
> convert "${i}" "$(basename "${i}" .eps).png";
> done

OK, that’s it, the entire *.eps content of your myepsimagefolder has been converted to png (a copy has been created, to be exact). Needless to say, it would work for pretty much any input or output format (just substitute the extensions for what you want in the second and fourth lines above).

Next up, rescaling. Rather than going through a similar ImageMagick process, we can make use of OS X’s excellent Automator. Unfortunately, there aren’t any options that let you use Automator to convert from EPS, but after that any image manipulation is pretty easy. Here’s a view of the Automator script we run:

That’s all: save, hit “Run”, and all your PNG images have been scaled to 256x256 px, maintaining proportions, and placed in a subfolder (pngscaled in my case). Pretty cool, huh?
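For reference, since ImageMagick is already installed at this point, roughly the same scaling step could be done straight from the Terminal. This is just a sketch of the equivalent, not what the Automator workflow actually runs:

mkdir -p pngscaled
# fit every PNG within 256x256 px, keeping proportions, writing copies into pngscaled/
mogrify -path pngscaled -resize 256x256 *.png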

Enjoy yourselves!

Creating a Service Using Automator for nvALT Notes Version Control

Introduction

I’m going to get pretty nerdy here for a moment.

So I dove in and I’m now using nvALT, Elements and Git for my note taking needs. Aside from a few minor hiccups, which I’ll address in a later post, this is really working nicely.

The one thing that I needed when I added nvALT and Elements to my note taking workflow was the ability to easily continue version control with Git. Before taking on this new process I was using Git in my notes directory and I wasn’t about to lose that option now.

But given the ease with which nvALT allows me to create new text files, and with nvALT built to work better with smaller files, I needed some way to get my version control under control.1

Here’s what my workflow was:

  1. Work, notes, work, notes.
  2. Work, notes, work, notes.
  3. Look at the time.
  4. Damn! How long has it been since I last did a commit?
  5. cd to my notes directory.
  6. Commit my notes to my repository.

It wasn’t exactly precise. Plus it had the added detriment of taking me out of what I was doing to commit my notes.

What I did2 was create a Bash script and then use Automator to create a Service for it, to which I also applied a keyboard shortcut.

If you’re still with me, here’s how I did it:

The Reveal

First3, create your Bash script.

I keep all my notes in one directory. This is the way that nvALT and, it seems, Elements like to work. With some light taxonomy (ala Merlin Mann and Mac Power Users) I have a reasonably good system in place. All my notes are in a Dropbox subdirectory called “notes.”

To get this to work I created a Bash script named git_notes.sh and put this in it:

#!/bin/bash

cd /Users/username/Dropbox/notes/
git add .
git commit -m 'nvALT Service Commit'
echo "* "`date`" nvALT Commit" >> /Users/username/Dropbox/notes/noteCommits`date "+%Y%m%d"`.md

Now, you’ll see that this is redundant and probably a lot silly, but this is what it does.

  1. It makes sure that we're in my notes directory using the absolute pathname.
  2. It stages only the modified files to the repository.4
  3. It commits those changes and uses a standard message for my Git log file.
  4. The last line is where it gets a little silly: it appends a message to another file in the same Dropbox subdirectory, one that also gives a human-readable date/time. Why am I doing this? I don’t know; maybe someday I’ll set up something to parse it and get some analytics on when I do most of my commits for my notes file.

Now that we have our Bash script, the rest is trivial.

First, we open Automator and select “Service” as our document type under the “Choose a type for your document:”

Next, we’re going to change the “Service receives” setting from the default “text” to “no input”. Leave the “in any application” setting as it is.

Penultimately, we click “Utilities” under the left-side “Library” dropdown and then drag the “Run Shell Script” from the middle column over to the right side.

Lastly, we enter in the location of the script in the workflow. It’s a good idea to use the absolute pathname here. In my case it was /Users/username/Dropbox/notes/git_notes.sh.
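One detail the screenshot would normally show, and an assumption on my part about the setup: the action runs its contents with /bin/bash, so the script needs to be executable (or be called through bash explicitly). Something like this:

# one-time, in Terminal: make the script executable (using the example path from above)
chmod +x /Users/username/Dropbox/notes/git_notes.sh

# then the Run Shell Script action body is simply the path to the script:
/Users/username/Dropbox/notes/git_notes.sh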

Once you save you’ll have a Service that will be available from any of your application menus. Click on the Service and it will do a Git commit of all the changes to that notes directory as well as update the faux log file we’ve created for the day.

But I don’t like to use the mouse/trackpad that much. So the final touch to this is to create a keyboard shortcut in your System Preferences. Go to Applications > System Preferences > Keyboard. Then choose “Keyboard Shortcuts” and select “Services” from the left side. Your new Service should be at the end of the Services listings. Just click the blank space at the far end of the window and you should get a text input field. You can use anything you want for your shortcut, but I chose control-option-command-shift-s for mine to avoid any chance of a keyboard conflict.

Conclusion

So there you have it. A quick and dirty way to make sure that the notes you’re producing will be version controlled through Git. As a final remark, I’ll say that having version control has already paid off for me.

I use multiple computers, and I made the huge mistake of accidentally deleting a bunch of files when a prompt popped up and I didn’t read exactly what it said. All of a sudden 20-some-odd notes of mine were gone. But version control to the rescue! I knew I had committed to my repo just before I deleted the files, and I was able to pull them back from the brink of deletion hell. It was even easier than it could have been because I had been using git add . instead of git add -A. All it meant was that I needed to unstage the deleted files. It was great, and a perfect example of why doing something like this makes sense.

Post Script

I should add that the use of a common/standard commit message is bad practice. The reason I do this, instead of throwing a prompt so that I can enter in a more detailed message, is that this is supposed to create a workflow that won’t interrupt what you’re in the middle of but give you the peace of mind that you’ve got things in a version controlled environment. This does not prevent you from going to your notes directory and doing a proper Git commit with a detailed message on what you’ve done since your last commit. In fact, at the end of this paragraph I’ll be committing properly to say that this draft is finished. And when I finish my review of the draft I’ll commit again, and message that it is ready for posting. After that, I’ll probably do a name change to the file (this is the taxonomy thing I mentioned before) and then do another proper commit.

One last point: You may have noticed that you can easily change the Bash script to point to any directory you want. Once you have the Service in place, you can edit your .sh file whenever you please and have a temporary keyboard shortcut for Git repo commits. I think that’s kinda cool.


  1. For example, I have already committed this post three times. 

  2. My solution was to produce a quick and dirty Bash script and then have it run through launchctl on a regular basis. 

  3. This tutorial assumes you have a local Git repo in place. If you don’t have one and want to learn how, you can check out my post on the subject of Git. 

  4. This is a key point because what I am not doing is staging any files for deletion, only modification. This means I won’t have to worry about any files going away without my knowing.

"Search With Google" Using Chrome

About a month back I tweeted:

people of the Internet: how do I make Search With Google (Command-Shift-L) use Chrome (my default browser) instead of Safari?

Mark Rowe informed me:

“Search With Google” is a service provided by Safari.app. You’d need to create a new service and bind it to that key equivalent.

Working from Mark’s suggestion, I first disabled Safari’s Service by unchecking System Preferences > Keyboard > Keyboard Shortcuts > Services > Searching > Search With Google.

Then I launched Automator and created a new Service. I added a single Run Shell Script action like so:


Here’s the text of the Ruby script to save you from having to retype it:

require 'cgi'
`open 'http://www.google.com/search?q=#{CGI.escape(STDIN.read.chomp)}'`
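# (Assumed action settings, since the screenshot isn't reproduced here: Shell is set to
# /usr/bin/ruby and "Pass input" to "to stdin", because the script above reads STDIN.)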

When you save your Automator workflow, Automator will ask you for a name for your service. I’ve unimaginatively named mine “Search With Google (Wolf)” to indicate it’s pretty much just like Safari’s built-in service, but tweaked by me.

Automator will automatically save your workflow into ~/Library/Services, but it’s up to you to assign your abandoned Command-Shift-L to your newly created variant.

To do so, find your new Service in System Preferences > Keyboard > Keyboard Shortcuts > Services > Text. Then double-click the column to the right of your Service and type Command-Shift-L to set your shortcut.

Mac tip: synchronizing folders between Macs

Since MobileMe will sunset eventually (2012), I started to look for a replacement for iDisk. I used to use it as my Home directory (except for sensitive information, since there’s no encryption) because I could have the same documents in all of my Macs, plus access them from any iOS device.

Now that the party has a date to end, I needed an alternative. For non-sensitive documents (public technical documents, public domain sheet music and guitar tabs), I’m using Dropbox. I can access it from GoodReader on the iPad/iPhone and keep it synchronized across all my Macs as well, so it fits for PDFs I need on the road. For OmniFocus, I’m using their new OmniSync beta service. That leaves only other documents in the open.


Scheduled Automator Backups

The Problem

I recently set up Backblaze for backing my computer up to remote storage. I have Time Machine set up on my machine, but that backup goes to a local drive. That is great for recovering files that I mess up along the way, or restoring my system if I kill it while experimenting with something in the terminal, but in the event of theft, fire, etc., the Time Machine drive goes with the computer and I’ll have lost my data.

For the price, I couldn’t find anything that beat Backblaze as a backup solution. There is no limit to the amount of data they will store, and by paying for two years up front, it worked out to less than $4/month. There is a minor catch though… there are a few things Backblaze won’t back up, the big one being the /Applications directory. Applications can be reinstalled, so I am OK with that. It is backing up all of my documents, music, movies, photos and all of the things I would be concerned with after a catastrophic event.

So the problem is this: I plan to use PHP and MySQL to work on some personal projects in the near future, so I have MAMP set up to run Apache and MySQL on my machine. MAMP’s root directory for Apache resides under the /Applications/MAMP directory, so as it stands, Backblaze won’t back up those files. Right now that isn’t a problem, but I didn’t want to wait until it was to come up with a solution1.

The Solution

To solve this, I decided to use Automator to create a scheduled task to copy my htdocs directory onto an external drive2. I fired up Automator, created an iCal Alarm project, added the two required items to the workflow and saved it. It automatically created an iCal event, and I scheduled the backup to run daily.

The Problem with the Solution

Everything worked the way I expected it to, and then I found a need to run the same process and include another directory in my backup. This is where I got stuck. This might be my lack of knowledge and a wrong turn or two in my Google searches, but I couldn’t find a way to access the workflow that had been added to my calendar in order to modify it. I found some mentions of a directory where it should have been stored, but it appears that this changed with Lion and I couldn’t find it anywhere, so it was back to the drawing board.

The Solution to the New Problem

My first thought was to scrap the original, create the workflow again and move on with my life, but that didn’t last long. It occurred to me that I would be less likely to add important items to my backup plan if I had to redo the work each time. That is a recipe for disaster, so I did a little digging around in Automator and found that you can run Automator workflows from other Automator workflows. So I quickly did the following:

  1. Created a new workflow as an Application
  2. Set up the same items that I had in my original workflow
    • Get Specified Finder Items
    • Copy Finder Items
  3. Added the directories I wanted to have backed up
  4. Saved the application to my ~/Scripts folder3
  5. Created another iCal Alarm workflow
  6. Added a “Run Workflow” action to it and pointed to my backup workflow
  7. Saved it and set up the daily schedule when prompted

Problem solved! And now, I can update the backup workflow whenever I have a need and the scheduled task will just run the updated version.
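For the curious, those two Finder actions boil down to a copy. A rough shell equivalent, just a sketch and not what the workflow actually runs, with a made-up backup volume name, would be:

# copy the MAMP web root (plus anything else you add) onto the external backup volume
rsync -a /Applications/MAMP/htdocs /Volumes/Backup/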

The Disclaimer

I got my Mac about 4 months ago as of the day I’m writing this, so I am pretty green when it comes to some of the OS X-specific stuff like Automator. There is a chance that I am doing stupid things or creating more work for myself. If you are a seasoned Mac user and think I’m an idiot, please feel free to point that out in the comments; I’d be happy to learn.


  1. There may be a way to change MAMP’s default for this, and I will look into that in the future, but this provided an opportunity for me to play with Automator and gave me something to write about. If I end up doing this, I’ll post a quick how-to on that as well.

  2. Backblaze will back up connected external drives too, as long as they aren’t Time Machine volumes. I partitioned my external 1TB drive to create a volume that Backblaze would back up.

  3. This is a folder I created to store Automator workflows, shell scripts, etc. 

Since I upgraded to Lion on my Mac, there have been a few things not working like they used to… One of them is iSync, which has quietly been deleted, and so has Front Row. iSync was easy to just copy from my Snow Leopard backup and drop in its usual spot in the /Applications folder, and Front Row I never really used anyway (but maybe soon I will take a closer look at Plex, just to see if I’m missing out on something).

Today I found yet another small thing I used to use a LOT - sending files to my Bluetooth devices!

For some reason it’s no longer there… Maybe because iOS is supposed to get WiFi sync soon? Anyway, I don’t have an iOS device, so I want my BT sending back! I spent some time with El Goog and found two links to my solution:

The missing link in Automator

How to build it in Automator

As seen in the picture above, tadaa, that’s how it should look! :)

Multithreaded Image Conversion and rar/unrar Python 3.2 Scripts and Automator Services

So here are the downloads for the multithreaded Python image converters and multithreaded rar/unrar scripts I made yesterday. You’ll need Python 3.2 to run them, since they rely on the concurrent.futures module from 3.2. Grab the installer here: http://www.python.org/getit/releases/3.2/. The Mac installer doesn’t overwrite your existing 2.7.1 Python install so you invoke the new python by typing “python3” in the Terminal.

The .py scripts:

sips_PNG This uses sips in OS X for conversion, so if you’re looking to use it in Linux, you’ll probably want to rewrite the last line of def sipper to use ImageMagick’s convert or something built-in.

unrar_threaded This should work in Linux without modification.

rar_threaded This should work in Linux without modification.

How to run the scripts: these take a batch of paths to items from standard input and then process them, so the easiest way to run them is to type “find /path/to/images/ | python3 path/to/sips_threaded.py” or “find /path/to/whattocompress | python3 path/to/rar_threaded.py”

For OS X only – Automator workflows

First, the sips conversion workflow files (PNG, TIF, TGA, PSD, JPG). If you just want something to make image conversion fast and easy in OS X, I made Automator services for OS X 10.7 (might work in 10.6) that don’t depend on the terminal or the scripts:

Grab them here: Download here. The only catch is that these Automator workflows require that you make Python 3.2 the default version run when typing “python” in the terminal. That’s done by writing this in the terminal:

sudo cp /usr/local/bin/python3 /usr/bin/python

The good news is that this isn’t destructive since /usr/bin/python was just a duplicate of /usr/bin/python2.7. /usr/bin/ still has Python 2.7, 2.6, and 2.5 from the default OS X install, so I can still use older versions of those by typing “python2.7”.

rar/unrar multithreaded:

Download


Uploading a File With One Keystroke

Starring: Quicksilver, Automator, and Transmit

  1. Set up a Service in Automator that applies to files and folders in the Finder. Make the Service pass the files it’s given into Transmit’s “Upload” Service. Select a website or Transmit Favorite to upload the files to. Save your new Service. Now you can right-click on files and go “Upload to $server_name”. That’s a good start, but let’s bam it up a notch.
  2. In Quicksilver, enable Proxy Objects and create a new Trigger that takes the current Finder selection and uses it as the input to the Service you just made. Bind the Trigger to a Hot Key. Now you’ve got access to that Service through a keyboard shortcut.

I press ⌥⇧⌃U and the file I’ve selected in the Finder is uploaded to my server.

My only problem with this workflow is that it causes Transmit to become the active application. If you know of a way to hide Transmit while the file’s uploading, do tell.