These notes are owenberesford.me.uk/resource/phphants-2015-feb

Who am I?

  • Some guy.
  • BSc Computer Science from UL, MSc Internet Systems Development from UP,
  • I have quite a few smaller arts qualifications
  • ZCE, BCS ~ will try to get Sensio certified this year
  • OO software engineer, not a web dev.
  • @channelOwen // http://owenberesford.me.uk/resource/contact-me

What have I done?

  • Author of this site ~ some resources have >50 references, ~ 120,000 words
  • Built quite a few document or documentation systems
  • Used PHP for high performance services
  • Write OO JS, jQuery etc
  • Extended/ revised services such as Asterisk, PowerDNS, qmail in C
  • Written complex fiddly things in Perl
  • As a graduate, wrote graphics pipelines in Java
  • A lot of B2B XML
  • Fixed a lot of bad UX, where the author was thinking about code, not users
  • Built and maintained reasonable scale systems (10^6 useful transactions per month)
  • MVC blah
  • Have never used Drupal, Wordpress, Wix, site building tools etc

Describe my process

  • This is industry standard.
  • Try to get a list of stakeholders; try to poll each of them
  • Try to have a list of requirements
  • Consider BDD ~ some employers don't like it, and BDD is only for complex or changing systems
  • Work out how to test, even if not BDD ~ isolated changes
  • Talk with stakeholders: is quality or turnaround time most important?
  • Log everything in the task tracker
  • Write outline / interface / static UX file
  • Build unit-tests for classes
  • Build a minimal implementation ~ so can confirm no surprises ~ such as no more server IPs
  • Performance may be thought about, but performance is built after features
  • Retest, review with stakeholders
  • Re-factor code, add lesser features, add more tests
  • UAT as relevant
  • Use tools for deployment, documentation, VCS etc
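
The "unit-tests for classes, then a minimal implementation" steps above can be sketched in PHP. SlugMaker and its behaviour are hypothetical, purely for illustration; the asserts stand alone, with no framework required (in the Test::More spirit mentioned later):

```php
<?php
// Hypothetical class under test: turns a title into a URL slug.
class SlugMaker
{
    public function make(string $title): string
    {
        $slug = strtolower(trim($title));
        $slug = preg_replace('/[^a-z0-9]+/', '-', $slug);
        return trim($slug, '-');
    }
}

// Stand-alone unit tests: no framework, just assert().
$s = new SlugMaker();
assert($s->make('Hello World') === 'hello-world');
assert($s->make('  PHP & Testing  ') === 'php-testing');
assert($s->make('') === '');
echo "SlugMaker: all assertions passed\n";
```

With this in place, refactoring the implementation later is safe: the asserts pin the external behaviour down.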

Test platforms

  • Built/ admin a VM cluster for network software testing.
  • Built an XML regression tester.
  • My tightly factored PHP classes here have stand-alone tests similar to Test::More.
  • Managed an AAA tester, built a v2 that scaled/ managed a lot better.
  • Built XML stress tester, AAA stress tester.
  • Managed SVN properties and commit hooks.
  • Built/ maintained a demarcation test suite with suppliers.
  • Built standard migration scripts.
  • Injected units and unit-tests into a new employer's entire platform.
  • Built DB comparator, to ensure platform consistency.
  • My website has outside-in tests for every single file format feature ~ about 200.
  • Did some Selenium E2E tests, hard to manage.
  • Built into a Mac CI.
  • Built custom JS tester (extending qunit).
  • Built into another CI.
  • Built a lot of Perl unit tests.
  • Real BDD via Behat.
  • lots of PHPunit.
  • FE tests via Browshot, Sauce Labs.
  • Got annoyed with the non-OO nature of Test::More, building punit.
  • Looked at Swarm.

My talk

  • My talk is on testing, not “boring history of some guy”.
  • I don't need to think that much when writing code, my tests tell me things.
  • I don't get stuck, I know what I'm doing.
  • As objectives are verifiable, timescales are easier to estimate.
  • Performance refactoring is measurable, and affordable.
  • If you are a professional, this talk may not cover much new ground.

Case study : <cough> merchant site

<Cough> is a narrow-market vendor, with about 20% saturation of the London market. <Cough> is doing a platform rebuild, and they need it. As a BA, I think they are trying to move ~ in practical terms ~ from entirely phone-based provisioning to a hybrid new marketing model. The older systems have no testing culture. I did a 30min trawl of their site before an interview.
Their existing business system:

  • Links between their multiple sub-brands fail ~ hhtp://... anyone?
  • i18n fails ~
  • JS fails ~
  • UX things that I would change
  • HTML3 tags
  • Business process depends on call center staff, not the website
  • The current CTO was hired to revise the above situation, and is working to manage/ change this.

Case study : a certain website, with white and pale blue branding

This guy wanted PHP5 experience, and so wrote a website via an expensive mechanism. He wrote his initial build and tests, and added a lot of content. For the next five years, he added more content, tests, UX and a few features. Mostly content. There is rough theme support; it has been re-skinned to better practice as more time was invested.

  • There was initially very simple code && I knew I would refactor a lot, so I wrote comprehensive outside-in tests.
  • I wrote tests aggressively simulating attacking requests, but no stress tests.
  • I tested my first skin on all available browsers.
  • I tested my content (i.e. words) manually.
  • I updated the crypto used in sessions. (real tests at this point).
  • My second skin was tested via Browshot, and RWD tested manually.
  • The JS has unit tests as it is written.
  • My content now has a few things that can be tested automatically.
  • I wrote an RSS asset and a sitemap asset, which worked when I wrote them. Google and everyone use these features; it's not possible for them not to have worked.
  • In Jan 2015 I found these didn't work on live.
  • They did work on my local machine, and all tests also pass (locally).
  • I backtracked, and as far as I can tell my host has changed versions of PHP.
  • As this is now 2015, use an EC2; they ship with Git and php-cli. This would make my life so much easier.

Questions now?

Quick question break...

Things that I don't test:

  • Exception classes
  • Code without branches
  • DI functions / getters / setters ~ unless complex
  • IMO “worker” code should be functionally tested not unit tested
  • Controllers without logic %%

Things that I do test:

  • UX / UI / design transparency as stakeholder - AT, then UAT
  • My speling [sic]
  • Params / types ~ also as asserts
  • Logic
  • and again
  • and again when the requirements change
  • Routes %% ~ which express a lot of complexity
  • Templates where possible
  • That exceptions occur at the right point
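
A minimal sketch of the "params/ types" and "logic" rows above. applyDiscount() is a hypothetical function, invented for illustration; the point is that legal values are asserted against the spec, and illegal values are asserted to be rejected:

```php
<?php
// Hypothetical function under test.
function applyDiscount(float $price, float $percent): float
{
    if ($percent < 0 || $percent > 100) {
        throw new InvalidArgumentException("percent out of range: $percent");
    }
    return round($price * (1 - $percent / 100), 2);
}

// Logic: test against the requirement, not the implementation.
assert(applyDiscount(100.0, 10.0) === 90.0);
assert(applyDiscount(19.99, 0.0) === 19.99);

// Params: an obviously illegal value must be rejected.
$threw = false;
try {
    applyDiscount(100.0, 150.0);
} catch (InvalidArgumentException $e) {
    $threw = true;
}
assert($threw);
echo "applyDiscount: all assertions passed\n";
```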

Tools that are useful:

  • Unit-test frameworks obviously
  • Event managers such as Jenkins or cron
  • Scripting engines : bash, vim, emacs, Jenkins, expect etc etc
  • Platform debug tools e.g. Symfony2 route:match
  • Validation tools, e.g. run JS through jshint, HTML through tidy
  • “Big” debug tools: wireshark, strace, xdebug, kmesg
  • Test db should run with querylog ON.
  • Test server should have interpreter warnings on loud (remember type casting costs CPU cycles)

Questions now?

...

Test world view

  • You must have a systematic perspective and process.
  • You must test against the requirements, not the code.
  • As a pure tester, ignore the work; just look at the state space.
  • To say a third time, test against reasonable scenarios.
  • Breaking things down precisely allows people to understand what they need to change.
  • As a developer, I find best results by doing tests on different days to implementation.

Integration testing

  • I tend to call this API testing.
  • In SOA terms, test one API rather than each class.
  • Needed for things like long-lifecycle APIs.

Regression testing

  • Most important when refactoring
  • Is the external interface the same as it was?
  • Can be built out of your previous API or unit tests.

Manual testing

  • This is the best known way to do RWD tests.
  • and similarly for random little CSS adjustments
  • Is very time consuming and doesn't scale.
  • There are some things that it is hard to write unit tests for
  • Needs to be done by a separate specialist.
  • This is necessary for looking at the text in error messages and similar.

Exception testing

  • Tests to induce exceptions to confirm that these happen as expected.
  • Generally more expensive to setup than unit tests.
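
A sketch of the idea: induce the failure, and confirm the exception happens where expected. Account and InsufficientFundsException are hypothetical names for illustration:

```php
<?php
// Hypothetical domain exception and class under test.
class InsufficientFundsException extends RuntimeException {}

class Account
{
    private float $balance;
    public function __construct(float $opening) { $this->balance = $opening; }
    public function withdraw(float $amount): void
    {
        if ($amount > $this->balance) {
            throw new InsufficientFundsException('balance too low');
        }
        $this->balance -= $amount;
    }
}

$acct = new Account(50.0);
$acct->withdraw(20.0);        // happy path: no exception expected

$threw = false;
try {
    $acct->withdraw(100.0);   // induce the failure
} catch (InsufficientFundsException $e) {
    $threw = true;
}
assert($threw);
echo "Account: exception raised at the right point\n";
```

The setup cost is in building the failing state; frameworks like PHPUnit wrap the try/catch boilerplate for you, but the shape is the same.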

Installation testing

  • Test your install process, not your code.
  • Requires installation automation.
  • Very important to test that installing a new release doesn't damage the existing clients/ products/ transactions.
  • Case study: one of the companies that I worked for had some contracted-out development; the external contractor had a "drop all tables" step as part of the installer.

Stress / scalability testing

  • Set up scenarios where you are doing a lot of your thing.
  • Only useful after the unit tests all pass, and the API tests pass.
  • Focus should normally include memory leaks, which are unlikely to be monitored in unit tests.
  • Should be done as an application stack, rather than code modules.

Behavioural tests

  • In PHP itself, Behat or PHPspec.
  • Tests can be written in tools like Selenium at a higher level.
  • Must test against the spec, not the implementation:
    • scenario: a behavioural test against an eCommerce “stock displayed count” should not look too closely at HTML structure, as the HTML may get edited to work on different platforms.
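
The stock-count scenario might look like this in Behat's Gherkin. All names here are hypothetical; note it only asserts the visible behaviour ("3 in stock"), never the HTML structure:

```gherkin
# Hypothetical Behat feature: tests the spec, not the markup.
Feature: Stock display
  Scenario: Customer sees remaining stock
    Given a product "Blue Widget" with 3 items in stock
    When I view the product page for "Blue Widget"
    Then I should see "3 in stock"
```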

Security testing

There is no such thing as security, and there never has been.
~ G Greer.

Test groups

  • Very short slide, this makes it easier to manage tests.

Heuristics

  • Do you have a test for every line in the requirements (functional)?
  • What state does a given test cover (think FSM)?
  • Try to cover all meaningful states
  • What obvious illegal states are similar to this?
  • Where the implementation involves several languages, what happens when you supply one of them inline as input?
    • e.g. feed shell exec commands via ` [backticks]
    • feed in SQL commands
    • feed in attempted buffer overruns (hard for PHP users to trap directly)
    • feed in malformed UTF8
    • feed in unexpectedly large data
    • feed floats for ints and vice versa
    • for XML, feed in malformed tags
      • or declarations, which can crash PHP

Most of these can be built very quickly, so it's not expensive to test.
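
A quick sketch of several of those heuristics against one hypothetical input filter, sanitiseLabel(), which I invented for illustration (it requires the mbstring extension for the UTF-8 check):

```php
<?php
// Hypothetical input filter under test.
function sanitiseLabel(string $raw): string
{
    // Reject malformed UTF-8 outright.
    if (!mb_check_encoding($raw, 'UTF-8')) {
        throw new InvalidArgumentException('not valid UTF-8');
    }
    // Strip characters with meaning in shell or SQL contexts.
    return preg_replace('/[`\'";]/', '', $raw);
}

// Inline shell: backticks must not survive.
assert(sanitiseLabel('`rm -rf /`') === 'rm -rf /');
// Inline SQL: quotes and semicolons must not survive.
assert(sanitiseLabel("x'; DROP TABLE users;--") === 'x DROP TABLE users--');
// Malformed UTF-8 must be rejected, not passed through.
$threw = false;
try {
    sanitiseLabel("\xC3\x28"); // invalid two-byte sequence
} catch (InvalidArgumentException $e) {
    $threw = true;
}
assert($threw);
// Unexpectedly large data should be handled, not crash.
assert(strlen(sanitiseLabel(str_repeat('a', 1000000))) === 1000000);
echo "sanitiseLabel: hostile inputs handled\n";
```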

Any Questions?

...

Not a test, fault tracing

This rarely occurs in systems with unit tests, but when you do have an error, try this:

  • From the behaviour, what type of error is it?
  • Is the issue happening on “happy path” execution?
  • Think of things that you can do that divide the problem space in two.
  • ...

Not a test, logging

My first employer liked logging, and I use it too. Logging displays state, and may assist where test cases don't cover. It enables hindsight where a debugger can't be used.

Design to test 1

  • Write your API in reasonable size chunks, so you can see what is happening when you unit test.
  • Try to avoid “invisible inputs”; they are hard to test.
  • Try to avoid global variables.
  • Have good automation, so test setup isn't manual.
  • All bytes are equal, there is nothing bad about making an RDBMS table for a test, then deleting it afterwards.
  • Have good revertibility, so tests can be repeated.
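
A sketch of the "no invisible inputs, no globals" points above: inject the clock, so the test controls time. All names here (Clock, SessionPolicy) are hypothetical, PHP 8 style:

```php
<?php
// The clock is an explicit dependency, not a hidden call to time().
interface Clock { public function now(): int; }

class FixedClock implements Clock
{
    public function __construct(private int $ts) {}
    public function now(): int { return $this->ts; }
}

class SessionPolicy
{
    public function __construct(private Clock $clock, private int $ttl) {}
    public function isExpired(int $startedAt): bool
    {
        return ($this->clock->now() - $startedAt) > $this->ttl;
    }
}

// Deterministic and repeatable: the same inputs every run.
$policy = new SessionPolicy(new FixedClock(1000), 60);
assert($policy->isExpired(900) === true);   // 100s elapsed > 60s ttl
assert($policy->isExpired(950) === false);  // 50s elapsed
echo "SessionPolicy: deterministic under injected clock\n";
```

Production code passes a real clock; tests pass a fixed one. The same trick works for random seeds, environment variables and filesystem paths.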

Design to test 2

  • Try to build so that you can do meaningful tests on each of the component interfaces (i.e. web server output, AJAX requests in the other direction).
  • Format validity tests should always be done via existing tools (e.g. tidy on XML).
    • And should be done often, and for no money.
  • Content tests are commercially important, and a pain. Give them 50% of your budget, in marketing/ eCommerce systems.
    • and the highest cause of re-work.
  • With good systems, test may be run against live.
    • mIS had a test client, whose orders never got processed.
  • Ensure that it is easy to access platform logs (in a secure fashion).
    • Maybe audit people looking at the logs?
  • REST is not glamorous, but it's cheap & open.
  • In some situations, a formatter will specifically tell you about failures. Not testing, but very useful. I do this for SQL.

Design to test 3

  • Good use of meta types increases dev performance
    • and is frowned upon by some OO authorities.
    • Do not test your typing, test your solution.
  • Good use of the properties of numbers increases dev performance
    • and is frowned upon by some OO authorities.
    • because they don't like maths? Or because it's harder to teach?
  • Borrow devops ideas, everything is code & may be unit tested

Any questions?

...

Ideal testing

  • Not really cost justified in most situations.
  • This list describes an imaginary app which is very stability-focused.
  • Use a framework; ensure the framework's tests are good enough.
    • Submit additional tests to framework repo.
  • Get requirements.
  • Build complete functional test suite against these ~ behat?
  • Ratify.
  • Build empty class API, add unit tests via annotations, iterate API as needed.
  • Build minimal code solution.
  • Ratify / review.
  • Add remaining features and gloss.
  • Retest.
  • UAT testing.
  • Deploy bundle.