Barcelona Ruby Conference 2013

On September 14th-15th I attended the Barcelona Ruby Conference 2013. This conference was organised by Codegram Technologies at the AXA Auditorium in the Illa Diagonal shopping centre in Barcelona and was attended by around 300-400 people. I had attended several conferences before but this was my first Ruby conference, so I was looking forward to seeing what the Ruby community had to offer.

Barcelona is a wonderful city to visit and made a great conference location. The weather was hot on the Saturday but rainy on the Sunday. The conference was single-track, with all talks held in one auditorium featuring a large stage for speakers and a big projection screen behind it. Both days of talks ran from 9am until after 6pm, with a dense schedule and lots of well-known speakers from the Ruby world. Both days also had an evening party, with Saturday night’s party held on the beach until late. Lunch and coffee were provided on both days by some of the many conference sponsors.

The opening day of the conference was scheduled to start at 9am, but large registration queues formed outside and it eventually started 30 minutes late. Whilst waiting in the auditorium for the introduction to start, it was clear that the audio/visual production quality of this conference was high. The conference had a space theme, so NASA-style space exploration announcements were played in a loop over the PA system before the introduction began. A great intro video was then played on screen, in the style of a Hollywood action movie trailer in which a single bad Git commit caused the global internet to fail catastrophically.

The Master of Ceremonies was Dr Nick (I didn’t catch his surname) and he came on stage dressed in a full astronaut costume. He introduced the conference with great enthusiasm, telling lots of jokes and giving the standard conference attendee announcements. He also announced that the astronaut costume he was wearing was being raffled off at 5 euros per ticket. Dr Nick acted as MC for the entire two-day programme and also coordinated the Q&A sessions after each talk. The opening keynote speaker was then Ruby creator Yukihiro “Matz” Matsumoto with “Changing your World”. Matz talked for 50 minutes on the Ruby community, how he learned about coding from examining the Emacs source code and how he created Ruby. He made the statement that “language development is about programming the minds of programmers” and also discussed future plans for the standard MRI implementation of Ruby.

The second speaker was GitHub engineer Vicent Marti on “Once upon a time, Ruby”. He presented three problems encountered in hosting large Ruby applications at GitHub, each told in the style of a children’s fairy tale. The first of these stories was on MRI mark-and-sweep garbage collection and its difficulties in managing Ruby programs that interact with C libraries. His second example was on timeouts and how to manage them. Vicent Marti’s final example encouraged more people to use JRuby and Rubinius instead of MRI Ruby, as he believed that MRI has too much legacy code to remain the reference Ruby implementation in the long term. He finished with a plea to gem creators to test their libraries against JRuby and Rubinius for correct functionality.

The next speaker was then Chris Kelly on “Rabbit hole: garbage collection and Ruby’s future”. This was a very low-level talk with lots of C code showing how garbage collection works in MRI Ruby, in order to communicate that the MRI Ruby garbage collection system is a non-trivial piece of Ruby infrastructure. He described how C functions such as gc_mark() and gc_sweep() are used in mark-and-sweep garbage collection, and stated that this C code was “macros all the way down”.
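
The mark-and-sweep idea itself is simple enough to sketch outside of C. The following is an illustrative Python toy, not MRI’s actual implementation: marking walks the object graph from the roots, and sweeping frees anything left unmarked.

```python
# Illustrative toy of mark-and-sweep, not MRI's actual C implementation.

class Obj:
    def __init__(self, name):
        self.name = name
        self.refs = []       # outgoing references to other objects
        self.marked = False

def mark(obj):
    """Mark phase: walk the reference graph from a root object."""
    if obj.marked:
        return
    obj.marked = True
    for ref in obj.refs:
        mark(ref)

def sweep(heap):
    """Sweep phase: keep marked objects, clearing marks for the next cycle."""
    survivors = [o for o in heap if o.marked]
    for o in survivors:
        o.marked = False
    return survivors

a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(b)     # a -> b; c has no path from the root
heap = [a, b, c]
mark(a)              # a is the only root
heap = sweep(heap)
print([o.name for o in heap])  # ['a', 'b']
```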

The final speaker before lunch was then Matt Wynne on “Treating objects like people”. He began with the statement that, despite being an OO programmer for many years, he had recently realised that he’d been using Ruby objects wrong. He showed the existing source code for the Cucumber BDD framework and said that, in retrospect, he felt using a data-centric domain model for it was a mistake. He gave a brief history of OO programming languages, from SIMULA-67 to Smalltalk-80 and C++, and recommended both the GOOS and Object Thinking books for a better understanding of OO programming. He then showed the source code for Cucumber 2.0, a rewrite of the library in a more immutable, functional style.

A packed lunch was then provided and attendees were able to sit and enjoy the hot Barcelona weather in a park outside the conference venue. The first talk after lunch was “Rules” by Sandi Metz, author of Practical Object-Oriented Design in Ruby. She introduced 5 rules for better code quality, including short classes and short methods. She conveyed her theme of rules by presenting fascinating asides from social science research on the tendency for people to be law followers or law breakers. This included research examples on people crossing a series of pedestrian crossings in the fastest times possible and competing players drawing resources from a shared regenerating pool when resource renewal limits existed. She made the point that turn-based collaborative games can lead to reciprocal rule breaking, but that humans tend towards collaborative behaviour.

The next talk was then Jeremy Walker on “Refactor your productivity”, which contained no code but offered many life tips on becoming a more productive programmer. He recommended the Coffitivity background-noise app for those used to coding in coffee shops. He also presented a slide that simply said “Vim”. There was an expectant pause from the audience, then a 70/30 mixture of cheers and boos as he revealed the next slide, which said “Use Vim”. He did concede on a following slide, however, that Emacs is also available. He finished off with fascinating diagrams of webs spun by spiders that had been fed flies injected with cannabis, sleeping pills or caffeine, showing the effects of these drugs.

Next up came Heroku employee Richard Schneeman with “Millions of apps: what we’ve learned”. He gave deployment tips such as never putting secret keys or any kind of login credentials into version control, but reading them from environment variables instead. He also advised aiming for development/production parity: having different database implementations in development and production is a bad idea. He added that there are no performance problems in a software application, only visibility problems.
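
The environment-variable advice applies in any language. A minimal Python sketch (SECRET_KEY is a hypothetical variable name, set inline here only so the example is self-contained):

```python
import os

# SECRET_KEY is a hypothetical variable name; it is set inline here only
# so the sketch is self-contained -- in a real deployment the environment
# (e.g. platform config vars) would provide it.
os.environ["SECRET_KEY"] = "example-token"

# Application code reads the credential from the environment, never from
# a file committed to version control.
secret = os.environ.get("SECRET_KEY")
assert secret is not None  # fail fast if the environment is misconfigured
```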

The following talk was then Ruby Tapas presenter Avdi Grimm on “You gotta try this”. He presented his Naught Ruby library for building Null Object classes and also showed great enthusiasm when describing his views on both the teaching and learning benefits of pair programming. He also showed his www.pairprogramwith.me site and #pairwithme Twitter hashtag for encouraging pair programming, especially remote pairing.

The final talk of the Baruco opening day was then Corey Haines on “Design patterns and the proper cultivation thereof”. He described the origins of design patterns, including the c2 Wiki where the original design pattern discussions took place. He then discussed the correct times to use design patterns, whether they were still relevant in modern software development and recommended Refactoring to Patterns as a good book for the practical use of design patterns.

The opening talk on Sunday was a second keynote, “Hunters and Gatherers” by Paolo Perrotta. As with all keynotes, it was more abstract than a pure coding talk: he discussed the evolution of programming languages. As a metaphor for this evolution, Paolo Perrotta described how mankind learned to calculate longitude, which he called one of the greatest scientific achievements of the human race. He then described the introduction of new programming languages as analogous to unexpected evolutionary adaptations, and compared the popularity decline and death of some programming languages to “evolutionary dead ends”, giving the SOAP protocol as an example. In line with the “Hunters and Gatherers” title, he explained that programming language designers with strong opinions who act single-mindedly are hunters, whereas those who build on and incrementally improve existing languages are gatherers. He finished by advising a strategy of “strong opinions, weakly held” when developing new programming languages and frameworks.

The next talk was JRuby lead Charles Nutter on “The future of JRuby”. He gave a low-level technical talk on current and forthcoming features of JRuby, including the invokedynamic JVM feature that has helped to improve JRuby’s performance. He mentioned Graal, Oracle’s Java API to the JVM’s native JIT compiler, and the Truffle framework for building language implementations on top of Graal. He then gave a code example from the jo gem, which implements Go-style goroutines in Ruby, and mentioned a recent Google Summer of Code project that aimed to optimise JRuby startup times.

Next up was Bryan Helmkamp on “Building a culture of quality”, where he discussed improving the quality of code created by a software organisation, especially when starting from a low level of existing code quality. He advocated introducing process changes as experiments, so that positive results from these experiments could be used to justify the changes to sceptical or reluctant team members. He also suggested introducing a culture of lunchtime brown-bag sessions for knowledge sharing, such as watching screencasts or discussing interesting blog posts.

The final talk before lunch was then “iOS Games with RubyMotion” by Brian Sam-Bodden. He began with a controversial quote from the RubyMotion creator that “We do not believe that Xcode makes a good environment for Ruby development (or development in general)”. He then mentioned the bubble-wrap and motion_model gems for RubyMotion and gave the statistic that games make up 60% of paid apps on iPhone and 50% on iPad. He introduced the joybox gem for RubyMotion game creation with a built-in physics engine and presented his two RubyMotion games rm-tetris and rm-super-koalio. He gave a brave and ambitious coding demo on these two games, including a sequence of 13 Git commits to show incremental progress on developing his rm-super-koalio game in RubyMotion. This was a great talk on the power of RubyMotion in creating iOS games.

After lunch came “Yak shaving is best shaving” by Aaron Patterson. Aaron Patterson is a very funny and entertaining speaker, as well as a well-known and well-respected Rubyist. He presented a simple bug fix that, out of curiosity, led him to delve deeper into the underlying Ruby and C code for the library. He questioned a performance claim made in a comment in that section of code and chose to investigate that assumption too. His point was that sometimes digging deeply into source code and allowing yourself to be distracted by your own curiosity can improve your software knowledge.

The next talk was “A Rubyist in Clojure-land” by RSpec creator David Chelimsky. He discussed using functional idioms from Clojure in Ruby code, especially chaining together functions to form a pipeline. He also compared a few different expectation libraries that could be used in RSpec assertions.

Following him came Katrina Owen with “Freeloaders”, where the slides for this talk were styled as a 1980s computer game. She discussed fixing a bug in some terrible Ruby code, where she gained XP (Expletive Points) based on the poor quality of the code. She then discussed the team behaviour that leads to such poor code and introduced a Game Theory analogy to explain it. She made the intriguing comparison between individual developers on a team cutting corners to produce bad code and the Prisoner’s Dilemma, where developers can either collaborate (write clean code) or cheat (hack together quick, bad code to implement features). This was an analogy that I’d never considered before and which presents the problem in a different light.

The next talk was then Reg Braithwaite on “What developing with Ruby can teach us about developing Ruby”. He began by referencing the Infinite Monkey Theorem and stated that the main problem in software development is separating very good code from the rest. He continued on this theme with the assertion that searching for changes in programming language syntax or faster languages is merely desiring faster typewriters. He finished with the belief that the two most important things for Ruby to concentrate on are making it easy to write tests and encouraging collaborative software development through social coding sites such as GitHub. This was a thought-provoking talk exploring aspects of Ruby language culture and introducing ideas from the social sciences, rather than a lower-level coding talk.

This reached the end of the main talks, although there were then 8 lightning talks of 5 minutes each. These covered a variety of topics, the best of which was from Anatoli Makarevich. He’d watched Sandi Metz’s “Rules” talk from the previous day and written the sandi_meter library overnight to test whether a given code base adhered to those rules. This brought the final day of the conference to a close.

Overall I felt that this was a great conference. The speaker list included lots of high-quality Rubyists and covered a wide range of topics. Some talks such as Charles Nutter’s and Chris Kelly’s were very low-level and included lots of core Ruby implementation code. Other talks like Reg Braithwaite’s, Jeremy Walker’s, Sandi Metz’s and Paolo Perrotta’s were much more abstract, covering non-software topics as thought-provoking asides. Some talks were given by very confident, funny and entertaining speakers, especially Aaron Patterson. My favourite talk of the conference was “Freeloaders” by Katrina Owen.

My only criticism of this conference was the venue wifi. A large part of a modern software conference is the interaction and discussion on Twitter among attendees, which generates conversations in real time both between attendees and with people not present at the conference. The wifi was so poor, however, that it was difficult to send tweets or browse the internet at all. A conference with 300-400 geeks, most of them bringing several computing devices, poses a non-trivial networking problem and would stress-test many wifi networks. Even so, the wifi was simply not good enough, and the conference organisers knew in advance how many people would be attending.

This was my only criticism of an otherwise great conference and I would attend again in a future year. The conference attendees were friendly and this was an entertaining conference as well as an educational one. The mark of a good conference is that a person leaves feeling that they have lots of new tools and libraries that they need to learn and new books that they need to read. I definitely felt that way leaving this conference and there were many informative talks on new libraries or techniques to try in my day job. Congratulations to Codegram and all of their volunteers for a well-organised and successful conference.

C# console application runner

An important part of a comprehensive automated test suite for an application is end-to-end tests. These are tests that interact with the application in the same way that the end user would, i.e. from the user interface. These end-to-end tests record the program output and assert that it matches the output intended by the application creator.

For a console application, an end-to-end test would involve entering input at the command line and then recording the command-line output. In C#, this could be done by running the application from the Windows Command Prompt and redirecting standard input and standard output, as with a Unix terminal. In an automated test suite, this would require a script to run the console application and store the program output in a text file. This script could be called from a test framework such as NUnit, the test output file read and then assertions made against the output file contents.
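
As a language-agnostic sketch of this redirection technique (written in Python here, with a hypothetical stand-in “application” written to disk so the example is self-contained):

```python
import subprocess
import sys
import textwrap

# Hypothetical stand-in for a compiled console application: a tiny
# adder script written to disk so the sketch is self-contained.
with open("calc_app.py", "w") as f:
    f.write(textwrap.dedent("""\
        a = int(input())
        b = int(input())
        print(a + b)
    """))

# Run the "application" end-to-end: redirect standard input, capture
# standard output, then assert on the recorded output.
result = subprocess.run(
    [sys.executable, "calc_app.py"],
    input="2\n3\n", capture_output=True, text=True)
assert result.stdout.strip() == "5"
```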

To simplify this process, I have created a C# runner for console applications:

C# console application runner

This C# console application runner was written as a Visual Studio 2012 solution, so cannot be opened in earlier versions of Visual Studio. All of the C# code for the console application runner is contained in a single file, however, so this code could be copied and pasted into earlier versions of Visual Studio:

Application runner C# code

The application runner is initialised with the details of the application to run. It is then supplied input lines to pass to the console application and the application output is then returned. Test assertions can then be made against the application output, in order to ensure that the console application is working as intended.

Here is a simple example of using the console application runner in end-to-end tests. I have created a console application called “ConsoleCalculator”. The name of the class containing the Main() method in the application is “CalculatorProgram”. The ConsoleCalculator application accepts string inputs for mathematical sum questions, then returns the answer to the question. The test below is for addition and is written with NUnit, but other C# testing frameworks could be used instead. The ConsoleApplicationRunner class accepts two constructor arguments for the type of the console application class and the name of the application executable file (minus the “.exe” file extension):

Console Application Runner Code

Remote Desktop Connection to a Raspberry Pi

I’ve had a Raspberry Pi since May 2012, when the first wave arrived. It’s a great device, but requires a display unless you only ever connect to it by SSH over a network. The Raspberry Pi has an HDMI output port, allowing it to be connected to an HDMI TV for display. It’s inconvenient to carry a TV around every time I wish to use my Raspberry Pi, though, so I usually connect to it using a Remote Desktop connection from my Fedora 18 Linux laptop. This post gives instructions on how I’ve set up this remote desktop connection.

Pre-requisites

You’ll need:

  • A Raspberry Pi with the Raspbian OS installed on the SD card and a user account with password
  • A monitor and HDMI cable to display the Raspberry Pi output
  • A laptop to connect to the Pi
  • A Remote Desktop client installed on the laptop
  • An ethernet cable to connect from the laptop to the Raspberry Pi
  • A VNC server already installed on the Raspberry Pi. To install a VNC server, run “sudo apt-get install tightvncserver” from a terminal on the Raspberry Pi whilst connected to a network. Then run “vncserver” from a terminal in the Raspberry Pi, where this will prompt you to set up a password for connecting to the Raspberry Pi with VNC.

Instructions

1) Connect your Raspberry Pi to the monitor, boot the Raspberry Pi and log in.
2) Open up a terminal and change directory to /etc/network (cd /etc/network)
3) First of all you’ll need to give the Raspberry Pi a fixed IP address. This is done by changing the /etc/network/interfaces file
Before changing any configuration file in /etc, you should always back up the original! Therefore copy /etc/network/interfaces to /etc/network/interfaces_old (cp /etc/network/interfaces /etc/network/interfaces_old). This way, if anything goes wrong then you can simply restore the original /etc/network/interfaces file (mv /etc/network/interfaces_old /etc/network/interfaces).
Change the contents of /etc/network/interfaces to:

auto lo

iface lo inet loopback
iface eth0 inet static
    address 192.168.200.100
    network 192.168.200.0
    netmask 255.255.255.0
    broadcast 192.168.200.255

This gives the Raspberry Pi a fixed IP address of 192.168.200.100, which you’ll need for connecting to it by remote desktop.

4) Now create a new file on the Raspberry Pi at /etc/init.d/tightvncserver and give it permissions of 777 (sudo chmod 777 /etc/init.d/tightvncserver). Add the following contents to this new tightvncserver file:

#!/bin/sh
### BEGIN INIT INFO
# Provides: vncserver
# Required-Start: networking
# Required-Stop:
# Default-Start: 2 3 4 5
# Default-Stop: 0 1 6
# Short-Description: Starts VNC
# Description:
### END INIT INFO

export USER='pi'
eval cd ~$USER
case "$1" in
start)
su -c 'vncserver -geometry 1366x768' $USER
echo "Starting vncserver for $USER"
;;
stop)
pkill Xtightvnc
echo "vncserver stopped"
;;
*)
echo "Usage: /etc/init.d/vncserver {start|stop}"
exit 1
;;
esac
exit 0

5) Run the command “sudo update-rc.d tightvncserver defaults” in a Raspberry Pi terminal. This will automatically start a VNC server when the Raspberry Pi boots up.

6) Now disconnect the Raspberry Pi from the monitor.

7) I connect to my Raspberry Pi from a Fedora 18 Linux laptop. After connecting the laptop to the Raspberry Pi with an ethernet cable, I need to ensure that both machines are on the same network. In Fedora this is done by first disabling wireless on the laptop. I then navigate to the network settings page (Activities > System Tools > System Settings > Network on Fedora), select the Wired option and click Options to manually edit my wired network settings. This opens a window, where I select the “IPv4 Settings” tab and change the Method dropdown from “Automatic” to “Manual”, allowing me to manually set the wired IP address of my Fedora laptop. The IPv4 Settings window has an Addresses table, where I click the “Add” button to add a new network entry.

I give this new entry the network settings of:

Address: 192.168.200.1
Netmask: 255.255.255.0
Gateway: 192.168.0.1

I then save these changes. These network changes ensure that the Fedora laptop and Raspberry Pi are on the same ad hoc network when connected by an ethernet cable. If you’re using an operating system other than Fedora Linux, make the equivalent network setting changes to manually set the wired network address.

manual_ethernet_address

8) Start up your Remote Desktop client on your laptop. Connect with the protocol “VNC” and the IP address “192.168.200.100:5901”. You’ll be asked to enter the password you set for vncserver, then a window will open for your Remote Desktop connection to the Raspberry Pi.

remote_desktop_client

You can now display your Raspberry Pi through a laptop without needing to use a HDMI monitor.

Implicit Builder Conversion in C#

I recently attended a Leeds Sharp user group meeting on design patterns. Grant Crofton was presenting an example of using the builder pattern in tests and showed a neat trick that can be used in C# when using builders.

A builder class has a selection of “with” methods corresponding to different properties of the target object, where these are called in a chain. The final method call in this chain is then usually a Build() method that converts the builder object into the desired target object, i.e.

Order order = OrderBuilder.anOrder().withCustomer(customer).withAddress(address).withPostage(2.00).build();
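
For illustration, here is what such a builder might look like in Python, with hypothetical Order and OrderBuilder classes mirroring the chain above (Python has no implicit conversion, so the final build() call is required):

```python
class Order:
    def __init__(self, customer, address, postage):
        self.customer = customer
        self.address = address
        self.postage = postage

class OrderBuilder:
    """Each with_* method records a value and returns self so calls chain."""
    def __init__(self):
        self._customer = None
        self._address = None
        self._postage = 0.0

    @staticmethod
    def an_order():
        return OrderBuilder()

    def with_customer(self, customer):
        self._customer = customer
        return self

    def with_address(self, address):
        self._address = address
        return self

    def with_postage(self, postage):
        self._postage = postage
        return self

    def build(self):
        # The terminal call that converts the builder into the target object.
        return Order(self._customer, self._address, self._postage)

order = (OrderBuilder.an_order()
         .with_customer("Ada Lovelace")
         .with_address("1 Example Street")
         .with_postage(2.00)
         .build())
```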

I learned that by using implicit type conversion in C# with the “implicit operator” keywords, this final Build() call can be omitted. For example, here I have two NUnit test methods, the first of which uses a final Build() call for the builder object and the second which uses implicit type conversion:

[TestFixture]
class AddressTest
{
    [Test]
    public void AddressWithHouseNameAndNoNumberDisplaysCorrectly()
    {
        Address address = AddressBuilder.anAddress()
            .WithHouseName("Dunroamin")
            .WithStreet("Golden Meadow Lane")
            .WithTown("Otley St Catchpole")
            .Build();

        Assert.That(address.ToString(), Is.EqualTo("Dunroamin, Golden Meadow Lane, Otley St Catchpole"));
    }

    [Test]
    public void AddressWithNumberAndBuildingNameDisplaysCorrectly()
    {
        Address address = AddressBuilder.anAddress()
            .WithNumber("5")
            .WithBuildingName("Hipster Flats")
            .WithStreet("Ironic Street")
            .WithTown("Hipsterville");
        Assert.That(address.ToString(), Is.EqualTo("5 Hipster Flats, Ironic Street, Hipsterville"));
    }
}

The code to achieve this implicit type conversion in the AddressBuilder class is:

public static implicit operator Address(AddressBuilder builder)
{
    return new Address(builder.HouseName, builder.Number, builder.BuildingName,
        builder.Street, builder.Town);
}

The full code for this builder example is available on GitHub.

Running Python unittest with command-line arguments

I’ve recently come across a problem regarding the unittest module in Python when using command-line arguments. This problem occurs both with Python 2.7 and 3.3. Here is a simple example of a Python test file my_tests.py to demonstrate this problem:

import unittest
import sys
from production_code import get_thing

class MyTests(unittest.TestCase):
    def testFirstThing(self):
        result = get_thing("first", command_line_param)
        self.assertEqual("new_first", result)

    def testSecondThing(self):
        result = get_thing("second", command_line_param)
        self.assertEqual("new_second", result)

if __name__ == '__main__':
    if len(sys.argv) != 2:
        sys.exit("ERROR: A command-line parameter must be supplied for these tests")
    command_line_param = sys.argv[1]
    unittest.main()

When I run this script with:

python my_tests.py foo

I get the following output:

Traceback (most recent call last):
  File "my_tests.py", line 18, in <module>
    unittest.main()
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/unittest/main.py", line 94, in __init__
    self.parseArgs(argv)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/unittest/main.py", line 149, in parseArgs
    self.createTests()
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/unittest/main.py", line 158, in createTests
    self.module)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/unittest/loader.py", line 128, in loadTestsFromNames
    suites = [self.loadTestsFromName(name, module) for name in names]
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/unittest/loader.py", line 100, in loadTestsFromName
    parent, obj = obj, getattr(obj, part)
AttributeError: 'module' object has no attribute 'foo'

I’d used unittest.main() many times before without issue, and it wasn’t too clear from the error message what the problem is here. What’s happening is that if command-line arguments are supplied to the test script in sys.argv, unittest tries to use these arguments as test cases. For the example above, unittest.main() is therefore looking for a TestCase object called foo, or a callable object called foo that returns a TestCase. As neither of these are present in my_tests.py, the above error is therefore occurring.

There are two ways to resolve this problem:

1) Extract the command-line arguments into separate variables and then remove them from sys.argv before calling unittest.main(). This is done by replacing the block at the bottom of my_tests.py with:

if __name__ == '__main__':
    if len(sys.argv) != 2:
        sys.exit("ERROR: A command-line parameter must be supplied for these tests")
    command_line_param = sys.argv[1]
    del sys.argv[1:]
    unittest.main()

2) Run the tests with an alternative to unittest.main():

if __name__ == '__main__':
    if len(sys.argv) != 2:
        sys.exit("ERROR: A command-line parameter must be supplied for these tests")
    command_line_param = sys.argv[1]
    suite = unittest.TestLoader().loadTestsFromTestCase(MyTests)
    unittest.TextTestRunner().run(suite)

I think that solution 1) is neater and easier to follow.

Installing FX/Ruby on Mac OS X

I’m currently teaching myself Ruby and recently needed to install the FXRuby graphical toolkit. I was initially expecting this to be as simple as

sudo gem install fxruby

However running this command on my Mac OS X 10.8 machine returned the following error output

Building native extensions.  This could take a while...
ERROR:  Error installing fxruby:
	ERROR: Failed to build gem native extension.

        /Users/dchetwyn/.rvm/rubies/ruby-1.9.3-p194/bin/ruby extconf.rb
*** extconf.rb failed ***
Could not create Makefile due to some reason, probably lack of
necessary libraries and/or headers.  Check the mkmf.log file for more
details.  You may need configuration options.

Provided configuration options:
	--with-opt-dir
	--without-opt-dir
	--with-opt-include
	--without-opt-include=${opt-dir}/include
	--with-opt-lib
	--without-opt-lib=${opt-dir}/lib
	--with-make-prog
	--without-make-prog
	--srcdir=.
	--curdir
	--ruby=/Users/dchetwyn/.rvm/rubies/ruby-1.9.3-p194/bin/ruby
	--with-fox-dir
	--without-fox-dir
	--with-fox-include
	--without-fox-include=${fox-dir}/include
	--with-fox-lib
	--without-fox-lib=${fox-dir}/lib
	--with-fxscintilla-dir
	--without-fxscintilla-dir
	--with-fxscintilla-include
	--without-fxscintilla-include=${fxscintilla-dir}/include
	--with-fxscintilla-lib
	--without-fxscintilla-lib=${fxscintilla-dir}/lib
extconf.rb:31:in `find_installed_fox_version': couldn't find FOX header files (RuntimeError)
	from extconf.rb:134:in `<main>'

Gem files will remain installed in /Users/dchetwyn/.rvm/gems/ruby-1.9.3-p194/gems/fxruby-1.6.25 for inspection.
Results logged to /Users/dchetwyn/.rvm/gems/ruby-1.9.3-p194/gems/fxruby-1.6.25/ext/fox16/gem_make.out

Although this error message is a little ambiguous, investigation revealed that the error was occurring because the FOX toolkit was not installed on my machine. FXRuby is a Ruby interface to the FOX graphical toolkit, so therefore requires FOX to be installed on the machine first. I already had the Homebrew package manager installed, so installing FOX was as simple as

brew install fox

Running sudo gem install fxruby a second time then successfully installed FXRuby.

Thank you GIST

This post is to say a huge thank you for the wonderful work done over the last three years by the GIST Foundation. GIST (Grassroots Innovation in Society and Technology) is a Sheffield organisation that was founded in early 2010 and sadly closed in December 2012. The aims of GIST were to provide a meeting place for those interested in technology and to improve the role of technology in wider society.

GIST was run by Jag and Hannah Goraya, Chris Murray and Ian Ibbotson. These four worked hard as volunteers and gave up a lot of their free time to build the GIST Foundation into a very successful organisation that played a huge role in the local tech community. The GIST Lab close to Sheffield train station was used by the Foundation as a meeting point for lots of technology user groups. GIST was responsible for many Sheffield geeks first meeting each other and several of these meetings led to software developers finding new jobs.

I first joined the Sheffield tech community in June 2009. I had just finished working as a postdoctoral university researcher in Mechanical Engineering at The University of Sheffield and was looking for work. I’d decided to make a career change and become a software developer. Although I’d been programming for 7 years at that point using MATLAB, Java and Python, I’d never worked in a commercial software organisation. I was aware that writing single-user scripts for scientific computation in academia was quite different to writing software commercially in conjunction with other developers.

I first began attending GeekUp Sheffield in the Showroom Cafe Bar. On my very first visit I was greeted by the organiser Jag Goraya, who took time to welcome me with friendliness and talk about my interest in software. I kept attending GeekUp Sheffield and got to know other developers. I was impressed that these were people who gave up their own time to improve their software knowledge and who believed in teaching themselves new skills. They seemed to have an inner confidence that although the problems that they were working on were difficult, they could resolve them given time and learning. As a former PhD student and postdoc I also believed in independent working and a self-teaching mentality, but I’d never seen this same mentality before in a software context. I resolved that I didn’t just want to become a software developer, but I wanted to become *this kind* of software developer. I found my first software developer job shortly afterwards in August 2009.

In 2010 GeekUp Sheffield was renamed to GIST Magazine and continued to meet on the first Wednesday of each month at the Showroom Cafe Bar. The format had changed though, with Jag Goraya acting as compere and multiple presentations being given within the two hours. The atmosphere was relaxed and friendly, with questions asked in an open environment. I saw some wonderful GIST Magazine talks in 2010, such as Ash Moran and Marc Johnson on promiscuous pair programming, and my software knowledge grew. I did a GIST Magazine book review of Cory Doctorow’s excellent For the Win in late 2010 because I knew that this would be the audience to appreciate such a book. I was interviewed about the book by Jag whilst sitting on a sofa in front of the other GIST Magazine attendees, which felt at the time like being on Parkinson!

In 2010 I started attending software user groups at the GIST Lab such as Sheffield Ruby Group, Sheffield PHP Group and Open Data Sheffield. The GIST Lab could seat up to 20 people and was an excellent venue for enthusiastic developers to see talks or write code on topics that they were passionate about. I also began attending software user groups at Madlab in Manchester, such as XP Manchester, Python North West and Manchester Free Software. I continued to study new software techniques and programming languages in my own time, motivated by other GIST regulars improving their own skills by doing the same thing.

At the end of 2010 I decided that I wanted to organise a software user group at GIST myself. My favourite programming language was Python, having used it for scientific computation during my time as a university researcher. I was confident that there was sufficient demand in the Sheffield tech community for a Python group, and I confirmed this through conversations with fellow developers and on Twitter. I started the Python Sheffield user group at the GIST Lab in January 2011 and was thrilled when around 20 people attended the first meeting. Subsequent meetings continued to thrive, with a wide variety of Python content and regular coding sessions.

The GIST Lab

It was the GIST Foundation that made this possible. The GIST Lab was freely available for software user groups to use: a dedicated room where developers could learn, code and socialise with their peers in comfort, without being disturbed by noise. Every month we had two hours where we could talk about the Python topics that interested us and learn from the experiences of others. One of my favourite Python Sheffield meetings was when core IPython developer Thomas Kluyver gave us a set of IPython bugs and we worked in 5 or 6 developer pairs to fix them, then submitted our changes back to the IPython GitHub repo. Other user groups such as Systems Thinking, WordPress, Makers, Raspberry Pi and Open Rights Group later formed at the GIST Lab.

My confidence as a software developer grew, and I found that the skills I was learning at GIST user groups regularly came in useful in my day job. I was constantly building up a list of the techniques, programming languages and software books that I wanted to study next, whilst working my way through that list. In mid-2011 the GIST Foundation organised a two-day BarCamp at the Workstation, attended by large numbers of developers, many travelling from outside Sheffield. The Sheffield tech community was strong and able to represent itself proudly as part of a great UK city.

In mid-2012 I was working as a .NET developer and considered C# to be my main programming language. I had gained confidence from running the Python Sheffield group for almost 18 months and decided to start a new Sheffield .NET User Group. Again the GIST Foundation was very supportive and offered the GIST Lab free of charge for our monthly meetings. The Sheffield .NET User Group attracted a different set of developers from Python Sheffield, but both groups were made up of hard-working developers who believed in teaching themselves new skills and learning from their peers.

In October 2012 I started work at ThoughtWorks in Manchester, a software organisation I had long hoped to work for one day. GIST played a major role in this: I firmly believe that if I hadn’t started attending GeekUp Sheffield in summer 2009, and then multiple GIST user groups, I wouldn’t be a ThoughtWorker now. GIST gave me the opportunity to learn a great deal about software from a wonderful group of fellow developers and to share that knowledge with others. As an organiser of two software user groups, it’s very rewarding to have an environment such as GIST where knowledge can be shared amongst smart people willing to learn. In the second half of 2012 I was honoured to become a key holder for the GIST Lab, with the responsibility of opening and closing the venue for user groups.

In December 2012 it was announced that the GIST Foundation and GIST Lab were closing. This was sad news and I feel that I owe them a great debt. There are many people I wouldn’t have met without GIST and they’ve helped the Sheffield tech community to grow. I’m confident that this community will continue to thrive and that the connections made through GIST will be maintained.

Thanks again to Jag, Hannah, Chris and Ian for all of their hard work over the last three years and to all of the regular GIST attendees. I’m very grateful and I’m sure that many other Sheffield developers are too.