Following up on my "Finding Python people is hard" post, I figured I'd send the call out again. We're looking for local-to-Massachusetts (we're in Acton, MA) people interested in joining a dynamic, quality-focused test/automation team. Ideally, candidates are fluent in both testing (areas may include performance, regression, web, streaming video, and storage) and Python programming.
If you're a strong testing person with some programming experience - maybe you're not fluent in Python - we'd still be interested: we have no problem teaching you Python. If you're a strong Python person without a testing background, you're also welcome. Internally, we use Java, C++, and Python - experience with all or some of those languages is great.
Even if you're just starting out - perhaps you've just graduated from college - we're looking for people who want to be great engineers. We look for strong engineering skills and contributions to open-source work, and we're looking for people of many skill levels to join the team.
The role is for someone to join the engineering team with a focus on automated test engineering. We don't slot people into "just testing" or "just dev" - we hire great engineers, and people who want to be great engineers. The core developers help drive testing, and the testing-focused people help drive core development. The entire company is focused on providing the highest-quality product to our customers.
As part of this role, you will develop everything from simple unit tests to highly complex functional-level tests. One of the more challenging aspects is that the product itself is very performance-driven, so the tests we develop (in Python) have to be able to push a product capable of driving tens of gigabits of video data across the wire to its very limits. The product uses distributed technologies and is a loosely coupled system - we have to test and prove that as well.
Internally, we're using tools such as the processing library for concurrency, nose, YAML, and so on. We encourage open-source contributions and community involvement (see the nose plugin I recently open-sourced, and the PEP 371 work I've been able to do), as well as exploration of new technology that might help us devise more efficient testing strategies. If you like pushing boundaries - this is the place for you.
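To give a feel for how lightweight that tooling is: nose collects any function named test_* in a module, with plain asserts and no test-class boilerplate. A minimal sketch - the chunk helper and its values here are purely illustrative, not part of our codebase:

```python
def chunk(data, size):
    """Split a byte string into fixed-size pieces (hypothetical helper)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

# nose discovers these automatically - no class, no registration.
def test_chunk_sizes():
    pieces = chunk(b"abcdefgh", 3)
    assert pieces == [b"abc", b"def", b"gh"]

def test_chunk_empty():
    assert chunk(b"", 3) == []
```

Running `nosetests` in the directory picks both functions up and reports pass/fail; that low ceremony is a big part of why we like it.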
A great example of one of the challenges is a test I've worked on for some time: a highly concurrent test that leverages a single test client's resources to the maximum to drive load against the system, while also generating statistical-anomaly events to trigger internal behavior in the system. Of course, designing it for one test client won't scale: the test has to be locally concurrent as well as able to spread out to multiple testing clients. Oh - and it has to generate hundreds of gigabytes of data as fast as it can to push the system.
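The locally-concurrent half of that idea can be sketched in a few lines. This is an assumption-laden illustration, not the actual test: it uses the standard-library multiprocessing module (the successor of the processing library mentioned above), random bytes stand in for real video payloads, and the worker/pool names are made up:

```python
import multiprocessing
import os

def generate_load(n_payloads, payload_size):
    """One worker: produce n_payloads buffers, return total bytes produced."""
    total = 0
    for _ in range(n_payloads):
        buf = os.urandom(payload_size)  # stand-in for real video data
        total += len(buf)
    return total

def run_workers(n_workers, n_payloads, payload_size):
    """Fan the generation out across local CPUs. Scaling past one machine
    would replace this pool with a dispatcher across multiple test clients."""
    with multiprocessing.Pool(n_workers) as pool:
        results = pool.starmap(
            generate_load,
            [(n_payloads, payload_size)] * n_workers)
    return sum(results)

if __name__ == "__main__":
    print(run_workers(4, 10, 1024))  # 4 workers x 10 payloads x 1 KiB
```

The real test layers the anomaly-event generation and cross-client coordination on top of this pattern, but the fan-out/aggregate shape is the same.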
Some of the technologies I've been personally exploring are the actor-model approach to concurrent programming, Twisted for asynchronous/concurrent testing, and so on. No technology or approach is excluded - we approach all of the test development with the Zen of Python in mind:
There should be one-- and preferably only one --obvious way to do it.
And just to add to that: there is only one way to do it - the way that works. If we find that an old approach doesn't scale or do what we need it to do, and we have a new approach that can do it better, faster, or cheaper, we're not afraid to adopt it.
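For anyone unfamiliar with the actor-model idea mentioned above, here is a minimal standard-library sketch of it - my own illustration, not code from our test suite. Each actor owns a mailbox (a queue) and a thread that handles one message at a time, so no shared state needs locking; Twisted offers an event-loop flavor of the same isolation:

```python
import queue
import threading

class CounterActor:
    """Hypothetical actor that tallies byte counts sent by load workers."""
    def __init__(self):
        self._mailbox = queue.Queue()
        self.total = 0
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, n_bytes):
        """Messages go into the mailbox; callers never touch state directly."""
        self._mailbox.put(n_bytes)

    def stop(self):
        self._mailbox.put(None)  # sentinel: drain remaining messages, exit
        self._thread.join()

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:
                break
            self.total += msg  # only this thread ever mutates total

actor = CounterActor()
for size in (100, 200, 300):
    actor.send(size)
actor.stop()
print(actor.total)  # 600
```

Because all mutation happens on the actor's own thread, many load-generating processes can report results concurrently without a single lock in sight - which is exactly the property that makes the model attractive for highly concurrent tests.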
This is a startup, and we're really ramping up on test automation - so nothing is set in stone. We use Ubuntu, OS X, and even Windows for development environments. Pick your editor, pick your machine - the team is focused on making us, and you, successful. Every engineer in this organization is empowered to do what it takes to get the job done.
If what I'm saying sounds interesting, send me an email or post a comment. I'm very interested in talking to you.