Ricoh GR shooting tip of the day, #1

(this probably won’t be a regular series; I’m just bad at titles)

The Ricoh GR mode dial

Here it is: Give up on Auto mode!  If you have a Ricoh GR, take advantage of the MY[1-3] settings.  You’ll get a much more satisfying result.  I’ve had my next-gen APS-C-sized Ricoh for about a year now (migrating from my wonderful but broken Ricoh GR Digital III), and every once in a while I’ll slip the mode wheel into Auto, the green camera icon, thinking that it’ll make my life a little easier, or that the camera’s software will know my intentions better than I do.  It never works.  I’ll just have to live with the fact that I know better than my camera.

On that note, I don’t want to say “hey, if you’ve got a Ricoh GR, you should learn to juggle your aperture and shutter anyway,” but there it is.  I said it.

My big tip?  Get cozy with TAv mode.

Auto will try to balance all three exposure variables (aperture, shutter speed, and ISO), which usually ends up creating blurry or mediocre pictures.  TAv mode lets you set your shutter and aperture, but handles the ISO for you.  This is wildly helpful if you’re realistic about your lighting conditions.

Here’s what I do, personally.  MY1 is based on TAv mode, with the high-contrast black-and-white filter applied and a few other personalized tweaks.  MY2 is also TAv mode, but saves colour pictures in pure RAW format (again, with a few personalized menu tweaks).  MY3 is “Instagram” mode: it saves pictures as large 1:1-cropped JPEG files with a date stamp.

So take 20 minutes to set up your MY[1-3] settings, and you’ll have a much more enjoyable time shooting with your GR.

BONUS TIP: Also, re-assign that “Effect” button on the left-hand side.  That button is just silly, but its placement can be very useful for other quick features, like cropping to the 35mm/47mm equivalent focal lengths.

 

What Instagram Mode may look like

Cabin owner beats back Northern Saskatchewan forest fire

Really awesome story via the CBC.  Mark Paquette managed to save his cabin from a forest fire by rigging up an industrial-grade sprinkler system.

Some interesting extra details from the comments section (which, for once, is relatively crazy-free.  Relatively…)

sprinklerhelp: @blahblahblah He used 2″ pipe creating a 200 x 200 perimeter around the property. Coming off the 2″ pipe is 3/4″ pipes running straight up 10ft with industrial brass impactor head sprinklers on top in 10 locations. The sprinklers shoot a 50ft radius. Not sure the exact pump used but i believe it pumps like 120 gallons a minute? riged up a fueling system so it would run or up to 18 hours….? About all i can remember…. i helped get the pieces together to build the system (how i know), he designed/ planned and installed himself. Worked like a charm apparently.. hope that helps with your inquiry.

Awesome work.

There’s so much going on with these Hacking Team e-mails #bitcoin #hackedteam

Via Reddit: apparently members of Hacking Team ran into Mark Karpeles in Tokyo?  Small world, but there doesn’t seem to be a lot of love there.  Anyway, as it turns out, Hacking Team had a tool to copy Bitcoin wallets from end users.  Not a big surprise.

You can get more insight into the Hacking Team e-mails at WikiLeaks without having to download the entire massive archive: https://wikileaks.org/hackingteam/emails/

Creating a custom mini-queue for Laravel 4.2

Let me preface by saying that there’s a better way to do this in Laravel 5.x.  I personally wasn’t totally on board with the version of this functionality in Laravel 4.2, so I whipped up yet another version of my own.

Because it’s Laravel, it actually took me longer to write this blog post than to get this bit of code working. High-5.

I had a series of very lightweight API calls to make to a third-party endpoint.  The endpoint’s reliability is pretty good, but things happen.  So I wanted to queue calls to it so that, in the event of trouble, I wouldn’t lose the call.  Pretty straightforward, right?  Here’s what I came up with.

I created a Job model as follows:


class Job extends Eloquent
{
	protected $table = 'jobs';
	public $timestamps = true;

	protected $fillable = array(
						'function',
						'data');

	public function run()
	{
		// Execute the named macro; macros must return false (or nothing) on failure
		$result = $this->macro( $this->function, json_decode($this->data) );

		if( empty($result) ) {
			$error = 'Job ID '.$this->id.' executing '.$this->function.' failed to run';
			Log::error($error);
			Helper::mailError($error);
			$this->retry++; // keep count of failed attempts
		}
		else {
			$this->imported = true; // mark done so it won't be picked up again
		}

		$this->save();
	}

	/*
		Any function that can run as a macro must return false on failure
	*/
	public function macro($name, $data)
	{
		switch ($name) {
			case 'widget_submit':
				$widget = new WidgetSender();
				return $widget->WidgetSubmit($data);

			default:
				Log::error('Job ID '.$this->id.' executing '.$this->function.' not found');
				return false; // an unknown macro counts as a failure
		}
	}

}

Above you’ll see a “macro” function. You don’t really have to do it this way and it has the potential to get pretty messy, but my rule is any job that gets put into a macro must be made as simple and modular as possible. Handle the ugly stuff elsewhere.

Using php artisan migrate:make jobs, I created the database table in the migration’s up() function:


		Schema::create('jobs', function($table)
		{
			$table->increments('id');
			$table->string('function');
			$table->binary('data')->nullable();
			$table->timestamps();
			$table->string('imported')->nullable();
			$table->string('retry')->nullable();
		});

To call this code, I use the following route and controller

Route:


// Run a generic job, for jobs that happen rapidly (once per hr?)
Route::get('/jobrun/{id?}', array('as'=>'jobRun','uses'=>'CronController@jobRun'));

Using this route with the controller below, hitting “/jobrun” fires off every job in the jobs table whose “imported” field is null.  If I specify a job ID (the ID assigned to it in the database), it re-runs that one individual job, which can be helpful for troubleshooting.

Code within my chosen Controller:


	public function jobRun($id=null)
	{
		$count = 0;

		if( empty($id) )
		{
			$jobs = Job::whereNull('imported')->get();		

			foreach ($jobs as $job) {
				$job->run();
				$count++;
			}
		}
		else
		{
			// find() already returns the model; the original ->get() call here was a bug
			$job = Job::findOrFail($id);
			$job->run();
			$count++;
		}

		echo "SUCCESS: ".$count." jobs run - check your e-mail for results";
		exit();
	}
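As a usage sketch, the two ways to hit the jobRun route look like this from the command line.  The hostname and the job ID 42 are placeholders, not values from the app:

```shell
# Placeholder hostname; swap in your app's actual URL.
BASE='http://example.com'

# No ID: runs every job whose imported field is NULL.
curl -s "$BASE/jobrun" || true      # "|| true" so a dead endpoint doesn't abort a calling script

# With an ID: re-runs only that one job (handy for troubleshooting).
curl -s "$BASE/jobrun/42" || true
```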

When I want to queue up a job in my code, I do the following:


	$job = new Job;
	$job->function = 'widget_submit'; // must match a case in Job::macro()
	$job->data = json_encode($member);
	$job->save();

This code is probably not suitable for high-traffic sites or more complicated operations. If you're in that situation on 4.2, it's better to read up on Redis or one of the other technologies Laravel 4 uses to manage queues with its built-in methods.
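Since the route comment suggests firing jobs about once an hour (and the controller lives in a CronController), a crontab entry could do the triggering.  This is just a sketch; the hostname is a placeholder:

```
# Hypothetical crontab line: hit the jobrun route at the top of every hour.
0 * * * * curl -s http://example.com/jobrun > /dev/null 2>&1
```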

If you have a blog that has a lot of code examples

If you have a blog that has lots of code examples, check out wp-code-highlightjs, which is based on the awesome highlightjs.  It has a boatload of options, and it supports 125 languages and 63 styles.

It allows you to add prettified code blocks to your blog posts like so:


10 PRINT "Hello World"
20 GOTO 10

Or


#!/bin/bash

USER='dbuser';   ## MySQL username (placeholder -- change me; the script below relies on these)
PASS='secret';   ## MySQL password (placeholder -- change me)
COMPRESSOR=`which bzip2`; ## You can change this to gzip if you want, or zip etc.
WORKDIR='/tmp';
ARCHIVEDIR='/home/scarr/00_ARCHIVE/sqldump';
MONTHLYDIR='/home/scarr/00_ARCHIVE/sqldump/monthly'; ## Where we're putting monthly backups
MYSQLQUERY="mysql -u$USER -p$PASS";
MYSQLDUMP="mysqldump --add-drop-table --allow-keywords -u$USER -p$PASS";
GREPEXCLUDE="Database|somedatabase|information_schema"; ## We can exclude databases here, as long as the first argument is a string that says "Database"
PRUNEDAYS="14"; ## The number of days of DBs we'd like to keep.
##END VARIABLES

for i in `echo "show databases" | $MYSQLQUERY | grep -E -v "$GREPEXCLUDE"`;
do
	echo "dumping $i, and compressing $i-`/bin/date +%m%d%Y`.sql using $COMPRESSOR";
	$MYSQLDUMP $i > $WORKDIR/$i-`/bin/date +%m%d%Y`.sql;
	$COMPRESSOR $WORKDIR/$i-`/bin/date +%m%d%Y`.sql;
done

## We compress the databases into a TAR archive
cd $WORKDIR && tar cvf `hostname`-`/bin/date +%m%d%Y`.sqlarchive.tar *-`/bin/date +%m%d%Y`.sql.*

## Delete the leftover SQL files
rm $WORKDIR/*-`/bin/date +%m%d%Y`.sql.*

## Move the final product to the local archive
mv $WORKDIR/`hostname`-`/bin/date +%m%d%Y`.sqlarchive.tar $ARCHIVEDIR/.;

## If it's the first of the month we'll archive a copy of the DB to $MONTHLY
FIRST_THIS_MONTH=$(date --date="this month" +%b-01-%Y)
RIGHT_NOW=$(date +%b-%d-%Y)

if [ "$RIGHT_NOW" == "$FIRST_THIS_MONTH" ];
	then
	## MONTHLYDIR is already an absolute path, so don't prefix it with $ARCHIVEDIR
	cp $ARCHIVEDIR/`hostname`-`/bin/date +%m%d%Y`.sqlarchive.tar $MONTHLYDIR/.;
fi

## Optionally we could prune the archivedir here.
if [ -e $ARCHIVEDIR/`hostname`-`/bin/date --date="$PRUNEDAYS days ago" +%m%d%Y`.sqlarchive.tar ]
then
	echo "pruning $ARCHIVEDIR/`hostname`-`/bin/date --date=\"$PRUNEDAYS days ago\" +%m%d%Y`.sqlarchive.tar";
	rm $ARCHIVEDIR/`hostname`-`/bin/date --date="$PRUNEDAYS days ago" +%m%d%Y`.sqlarchive.tar
fi
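For completeness, restoring from one of these archives might look like the sketch below.  Only the filename logic actually runs; the extract/import lines are left as comments, and the paths and the database name somedb are placeholders, not part of the original script:

```shell
# Compute the name of today's archive, using the same MMDDYYYY stamp
# the backup script builds its filenames with.
ARCHIVEDIR='/home/scarr/00_ARCHIVE/sqldump'   ## matches the backup script
STAMP=$(/bin/date +%m%d%Y)                    ## e.g. 07152015
ARCHIVE="$(hostname)-$STAMP.sqlarchive.tar"

echo "restoring from $ARCHIVEDIR/$ARCHIVE"
## tar xf "$ARCHIVEDIR/$ARCHIVE"                        ## unpack the day's compressed dumps
## bunzip2 "somedb-$STAMP.sql.bz2"                      ## decompress one database's dump
## mysql -u$USER -p$PASS somedb < "somedb-$STAMP.sql"   ## load it back into MySQL
```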