Daniel Harrison's Personal Blog

Personal blog for Daniel Harrison

PHP, The Greasy Mechanic Of The Web October 31, 2007

Filed under: development — danielharrison @ 1:07 pm

He’s greasy, often has bad BO, and sometimes consumes 100% of your time when your car’s undergoing a major service.  During peak times he doesn’t always scale well and can’t help you out until he can get more guys on, so there’s a bit of a lag if something urgent comes up.

Sometimes he’s unintelligible and isn’t consistent.

But he gets the job done and you put your life in his hands.

There often seem to be two schools. First, the dealer mechanics, who have a fleet of guys and associated apprentices. They do one particular model and they do it well; if you need something else then you’re out of luck, it’s out of scope.  The service centre has a white floor and walls and big movable red racks of tools. You don’t see grease or car guts anywhere. They pick you up and drive you to and from the service centre. You lose your car for a day, and if anything goes wrong, even longer. Expensive computerised record keeping and billing is the norm. They ring you when you need a service. You get a detailed accounting down to the washers and spark plugs they had to replace.

The other is the local guy down the road, walls and floor pitch black with soot, grease and oil. Old parts are stacked down the side where you walk past, because they might be needed at some point in the next 20 years. Record keeping is a torn and tattered book with blackish stains and fingerprints. You have to take the initiative: you walk into the tiny office after managing to find the door, and when you hit the bell a mechanic greets you after a couple of minutes. You ask him for a booking; he digs out the book, sticks his finger in his ear to get that annoying piece of ear wax which has been bothering him all day, has to yell out to Johno in the back because the bookings are a bit smudged, but you inevitably get a booking. You get a smudged, crumpled account with one item: ‘service’.  You’ve got to trust that he did the right thing; sometimes he’ll let you down, but most of the time he’ll save you time and money. You feel lucky and privileged.

PHP apps I use at work and home: MediaWiki, WordPress, phpBB.


The Pig On My Desktop October 29, 2007

Filed under: development — danielharrison @ 1:49 pm

Writing this out of frustration: Visual Studio (2005 SP1) is a pig eating my desktop!

It strikes me that it’s a good enough tool written by people who have never used anything else. My background is ‘enterprise’ Java, but part of my job requires me to work in the .net world, where I mess about with things like C# WinForms, .net web services and occasionally a bit of asp.net. For a good chunk of my day I’ll have IntelliJ IDEA on one monitor and Visual Studio on the other. I’ve used many editors over time (JBuilder, Eclipse, NetBeans, Emacs, Vim, ….) but the least responsive and most annoying appears to be Visual Studio. I do have ReSharper installed so I can have the same semantics as IDEA, but even with it off it seems like a pig.

My current gripes are:

  • Hangs everything. Startup blocks other operations. Build blocks other operations. I can’t do a build and switch to another tool.
  • I have my start bar on the left with autohide; starting Visual Studio changes the z-order so it’s under every other window. I have to hit the Windows key and click on a grey area to force it back to the top.
  • The default behaviour appears to be to continue compiling even when errors occur; whatever happened to fail early, fail fast? http://www.ftponline.com/vsm/2003_10/online/hottips/sabbadin/ shows how to change this behaviour.
  • Having a panel auto hide doesn’t guarantee that it will stay that way. At least once a day I have to set various windows to stop autohiding and then autohide again before the setting sticks.
  • This might actually be ReSharper, but windows dragged off onto the second monitor move back to the first after a run!
  • Tooltips seem to be late and laggy.

This is on a 2.16 GHz dual core laptop with 2 GB of RAM, running XP with a primary 7200 rpm drive. I followed some of the steps in this useful Visual Studio speed-up guide, but my main gripes are building and startup, which it won’t address.

Another gripe is the use of external APIs. This may be heresy in the .net world (from the blogs I read I didn’t think it was, but they may be biased [alt.net]), but we use open source components in our tools. Java tools fundamentally have to assume that you will be using external libraries, so making it easy in the editor to discover an API and how it works is important, and it’s something most do well off the dot keypress. When you have a company that does so much in a relatively isolated environment, this kind of thing just doesn’t happen. I think there’s a fundamental gap between the community and Microsoft. There are great people and developers both near to and within Microsoft trying to close it, but being such a big company with a schizophrenic relationship with open source means that internal solutions gain precedence over community or external tools, e.g. MSBuild vs NAnt. Java and other tools/languages, because they’re forced to exist in the open domain, inevitably adopt the most valuable, best of breed and community accepted tools as first class development tools. Think of it as Darwinian selection for software development. I shouldn’t need a different version of Visual Studio to integrate/write/think about unit tests.

Another bugbear of mine is why msdn is so slow. Sure it’s pretty, but I don’t want pretty, I want to GTD. E.g. ‘java list sort’ in Google comes up with the direct API doco, which is a simple, navigable HTML page; ‘c# list sort’ (again in Google) doesn’t return the API doco as the first result, and making this stuff hard to discover hurts. It seems that the developers exist in a walled garden; there are openings, and some adventurous folk venture outside, but a good many people prefer the garden.

I just saw this: http://beautifulcode.oreillynet.com/2007/10/skinny_languages_and_fat_tools.php . Fat tools or intelligent tools are great, but it’s the whole package, and adopting community-driven technologies as first class concepts is a way of leveraging the intelligence of the community. I just don’t see Microsoft adopting the good things coming out of the community. Sure, it’s Microsoft and they’re under no obligation to do so, but the competing model of open development, to me at least, seems to result in cheaper, quicker and more innovative outcomes. I think companies such as Google, Sun and Novell have all navigated this new paradigm much better.

Individually, all the issues I’ve outlined above are really just niggles. But cumulatively the whole package seems to make my experience as a developer more painful than it should be. I don’t tend to get excited when I write in .net because the niggles just distract and make it an unpleasant experience.


Interviews, Passion and Languages October 19, 2007

Filed under: development — danielharrison @ 7:46 pm

Raganwald posted an interesting article on learning programming languages: “When learning a new language or tool, do not shy away from things that seem different, weird, confusing, counter-intuitive, or unfamiliar. Do not put them off because you can work around them. Learn them, try them, and persist with them until you discover why they make sense. This is the true road to teaching yourself a new language. And in today’s fast-moving industry, it’s a road we all share.” – raganwald, The challenge of teaching yourself a programming language

My current role requires me to screen and interview people coming into the R+D team. We look for smart, passionate people, and developers in our team are expected to know both C# and Java, which apparently is relatively rare. We don’t screen you out if you only know one, but we make it pretty clear that you’d be expected to learn whichever you didn’t currently know. For some candidates this is a big deal, not because of language favouritism, but because learning itself seems to be a big deal for them. That tends not to bode well, and they tend to follow through with poor responses to non-language-specific algorithmic and design questions (How Would You Move Mount Fuji style). I think one great commonality shared by the guys on the team is that they’re all very clued in and are prepared to jump right in. They look for new solutions to novel problems, and languages, many hammers, are I think a requirement. Some have even had great success and written their own language. I tend to think it’s a good signal of passion for the profession in general. My feeling is that smart people relish challenges; encouraged and given the opportunity to learn a new language, they jump at the chance. Learning languages is something that developers just do, and the more you do it the easier it becomes.

For R+D or product companies, I think hiring people purely for a particular language skill is, in the medium to long term, a destructive strategy. My guess is the peak language span averages about 10 years, following the technology adoption curve. Code depreciates and rots over time, and I think that includes languages. Over time, classes of problems become more efficient to solve due to language features. This is one of the fundamental reasons I think cheap meta languages are such a ground-shaking concept, because your language can evolve as your problems do, but that’s another post. For development/product shipping organisations, unless you have people driving the product forward underneath, then relative to your competitors and new upstarts you’re going backwards. It is chaotic, and when you’ve got a group of smart, passionate people all pushing it can be like herding cats, but the upside is innovative and game changing software. My view is that great software engineers can’t stand imperfect solutions, it grates, and languages, with their different paradigms and lessons, are a way of finding the most efficient solution for the problem. Problem solving is fundamentally a good chunk of what successful software engineering means today (and well, hopefully in the past as well :). That means having someone on the team who is only prepared to deal with problems with a single hammer will inevitably hold the team back. Pragmatism is required, but people need to be able to make the jump.

At least in the market in which we’re operating at the moment (Australia), it seems a lot of people coming from overseas to do their masters, and even Australians coming through the undergrad program, are missing fundamentals. Undergrad study in Australia used to be more like a US masters, but that’s changing very rapidly to a US-style system, and I think this will only compound the problem of fundamentals and specific field depth. I think a lot of unis are now essentially vocational programs and are becoming single language courses, which is not good for the profession. At the uni I went to we never programmed on Windows machines; it was always a Unix variant. The business and commerce side had Windows labs, but not IT. It was made abundantly clear that marketable skills were something you would be expected to learn on your own time, and that’s the way the world worked.

Software engineering is a profession where aggressive learning is a requirement for your entire career, and my gut feeling at the moment is that there aren’t enough people taking up the challenge. The current state of play in software is fundamentally due to progress from standing on the shoulders of giants, and to take part means learning, and learning from, languages.


WordPress Theming October 16, 2007

Filed under: development,internet,me — danielharrison @ 12:59 am

Small Potato (should I not be using capitalisation?) created a very useful resource on how to create a WordPress theme.

I found it most useful in understanding how WordPress is hooked together. Having recently joined the blogging fraternity I needed to create a theme, but went about it in a slightly different way. If you’re familiar with CSS, HTML, PHP and the like but, like me, are just starting to write and itching to be up and running, then this is the alternative technique that I used. I tend to use this approach more generally for any site that’s already got well formed and structured content, treating it as a website refactoring.

First things first, make sure you’ve got Firefox, the Web Developer extension, as well as a trusted editor (I use EditPlus).

Choose your theme. I just switched to Classic, which seemed pretty simple and wasn’t going to be too hard to dig through and edit. Add a comment and some sample text that will be representative of your blog: throw in a few paragraphs, lists, and bold and italic text so that you’ve got a good sample of what’s going to go up.

Next, download the theme or just copy the existing one and rename it to your site so you can make changes locally and offline, e.g. Classic -> knowtu in my case.
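One thing worth knowing if you copy and rename: WordPress identifies a theme by the comment header at the top of its style.css, so update that too. Roughly like this (the values here are placeholders for your own, not anything prescribed):

    /*
    Theme Name: knowtu
    Theme URI: http://example.com/        (placeholder: your site's URL)
    Description: A personal theme based on Classic.
    Author: your name
    Version: 0.1
    */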

I didn’t particularly care that I would have an incomplete site available for a few days, so I could do this live and didn’t really need a local WordPress install or config. If you do, Small Potato’s guide links through on how to do this. From memory, in SuSE for example it’s available as an app to install via YaST, and being a PHP app it’s pretty straightforward anyway.

First step: save the page using Firefox’s Save Page As, saving it as ‘Web Page, complete’ to a suitable location.

[Screenshot: Save Page As in Firefox]

Secondly, open the content in Firefox and your editor and include a reset.css. Why? Well, once everything’s on a level playing field it’s much easier to go forward, as well as being closer to deterministic. I first came across the reasons for a reset CSS via an msdn blog. Then change the existing style.css reference to a local one (see the snippet after the screenshot below). Finally, remove the content from the style.css. Viewing the page in Firefox should show a pretty blank page with all fonts the same (including headings), which is exactly what you want, as the next step is to start moving and sizing things and you want a blank slate to go nuts on.

[Screenshot: change styles to local and include a reset.css]
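After those changes the head of the saved page ends up looking something like this (a sketch; reset.css is whichever reset stylesheet you picked, and the order matters, reset first so your own rules in style.css win the cascade):

    <head>
      <title>knowtu</title>
      <!-- reset first: put every browser on the same level playing field -->
      <link rel="stylesheet" type="text/css" href="reset.css" />
      <!-- then the local, emptied-out style.css, ready for new rules -->
      <link rel="stylesheet" type="text/css" href="style.css" />
    </head>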

Start looking at the content and figuring out how it all hangs together. I also ran HTML Tidy over it; if you want to grok the content, well formed and nicely laid out is usually easier.
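If you haven’t used it, a single command along these lines tidies the saved page in place (-indent and -modify are standard HTML Tidy flags; the filename is whatever Firefox saved):

    tidy -indent -modify blog-page.html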

Where the Web Developer extension comes in is that it gives you a way to edit the site’s CSS and see the changes instantly, so you can play around and see how you want things to sit. You can enable it by clicking on the CSS dropdown and selecting Edit CSS, or ctrl+shift+e. You should see a display like the one below.
[Screenshot: Firefox Edit CSS dialogue]

So, onto the formatting: structure first. Typically you want to play around with the layout and get a feel for how it’s going to sit on the page, which I tend to find starts giving you colour and font ideas. When I was doing mine I also changed all fonts to Verdana and got them roughly the right size. When doing the font sizes and structure I tend to use the named measures, e.g. small, xx-small. The advantage of this is it really shows proportions, and if you’re not sure, a quick trick I use to visualise is ctrl+mouse scroll. This quickly ups and downs your font measure by one, which is handy to see if you want a bigger or smaller font. This is where Edit CSS comes in really handy again. If you turn on view style information, you can quickly see the structural mark-up by moving your mouse over the page. The CSS updates as you type, so it’s a neat, cheap WYSIWYG editor with minimal overhead. If your CSS is rusty or you need a reference, try ILoveJackDaniel’s handy pdf cheat sheet. One thing though, don’t forget to save; from what I saw, switching tabs would lose your edits.
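To sketch what I mean by named measures (the selectors here are illustrative, not the Classic theme’s actual ones):

    /* named measures (xx-small up to xx-large) keep proportions obvious */
    body      { font-family: Verdana, sans-serif; font-size: small; }
    h1        { font-size: x-large; }
    h2        { font-size: large; }
    .meta     { font-size: x-small; }  /* hypothetical class for post metadata */

Bump any of these up or down one named step and the whole hierarchy stays in proportion, which is exactly the property you want while experimenting.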

Finally, start thinking about colour and fonts. If you’ve been playing around with the structure you’ve probably already been doing this, or have a colour scheme in mind, or have even drawn and coloured it out on a napkin. I used Adobe’s kuler to visualise colour themes.

During this you may notice the markup isn’t quite what you like; if you edit the page to be what you want in an ideal circumstance, don’t forget to go back to the template and make the matching changes. It’s usually pretty easy to track things down. Refer to the WPDesigner tutorial if you’re struggling. I tend to work better with formatting etc. so I mainly deleted stuff like the sidebar content and things I planned on growing later. XFN (the XHTML Friends Network)? Who needs it, I’ve got plenty of friends already. I’ve done quite a bit of XML stuff and if they’re still friends after all this time, good for them. I also did some minor formatting as I went: spacing, converting a few things to divs, and adding some extra classes so I could identify things better.

Finally, copy any CSS images back into the theme folder, create a screenshot called (not surprisingly) screenshot.png, and upload to the themes folder in WordPress. Select it in WordPress and it should now be running. You’ll probably want to run it back through the W3C validator to make sure it’s all kosher, but if you’ve done it right there should be no changes and it will all work out happily.

I tend to use this approach for sites that have well formed content and structure, and it’s a pretty quick way to get a site with a new design. The power of this approach rests on the strength of CSS, and the more semantic the markup, the easier it is to radically redesign a site in a matter of hours (granted, WP Classic is pretty simple).

I should point out I also only really cared about IE 7, Firefox, and browsers that can handle W3C CSS standards; for the people likely to be interested in this blog, along with two-thirds of the web, I figure the majority are going to be just fine with this. Having actually done sites with graceful degradation all the way back to the 4-series browsers, for something that I want to be fun I just couldn’t stand the pain.

So in wrap-up, this got my site up, running, and uniquely mine pretty quickly. It’s not complete and, like the content, will hopefully grow over time.


CruiseControl Custom Listener

Filed under: development — danielharrison @ 12:06 am

The first three points of the Joel Test, if you’re not familiar with it, basically ensure that your software is always in a known state. You always have a build you can throw to testers, internal users or sales, and it means fixing a critical bug is about the bug, not the build. If you’ve ever been on a project with a weekly or monthly integration task where builds and releases ‘come together’, you’ll understand the value of releasing early and often.

One of the tools I introduced where I work was CruiseControl. As a continuous integration tool, it means our software is always ready to go. Basically, the way we have it set up, clicking build can build all our Java and C# source, run unit tests, version and package into all the various installers, and publish to the releases directory for testing. The problem at the moment is the last mile. Our UI tests for the WinForms components are run in Mercury, sorry, now HP QuickTest Pro, which lives outside the continuous integration tool, and are run manually. This is a pretty obvious sore point, so I’ve been working with the testing team to close the circle.

The last step in our build automation plan is to automate the UI testing into our nightly and release builds. We basically have build machines for each product, and for each stream of each product, and then another beefy test box which runs all the Mercury tests. This distributed environment means that we needed the test CruiseControl to be able to watch the releases share to pick up new builds. Because we have multiple products publishing into this directory, the filesystem modificationset listener isn’t specific enough, as it won’t pick up a single product suite. So to resolve this, today I wrote a custom modification set listener. All up the process was pretty straightforward, and in the end, including unit tests and the automated install script for building and configuring the build machine, it took about an hour and a half.

So basically it’s the filesystem listener, but it now takes a base path and then a regex to watch for new directories being created in that base directory. Whenever we get a new root directory we know that we’ve had a published release and we should run the UI tests. The most confusing thing when writing the new plugin is that technically it appears that all the modification set classes implement a SourceControl interface, which, at least semantically, seemed misleading.
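For a flavour of the shape this takes, here’s a minimal sketch, not our actual code: it assumes CruiseControl’s SourceControl interface (getModifications, validate, getProperties) and its Modification bean, and the class and property names (NewDirectoryListener, basePath, pattern) are mine.

    import java.io.File;
    import java.util.ArrayList;
    import java.util.Date;
    import java.util.Hashtable;
    import java.util.List;
    import java.util.Map;
    import java.util.regex.Pattern;

    import net.sourceforge.cruisecontrol.CruiseControlException;
    import net.sourceforge.cruisecontrol.Modification;
    import net.sourceforge.cruisecontrol.SourceControl;

    // Reports a 'modification' for each new sub-directory of basePath whose
    // name matches the regex and which appeared since the last build.
    public class NewDirectoryListener implements SourceControl {

        private String basePath;
        private String pattern;
        private final Map properties = new Hashtable();

        public void setBasePath(String basePath) { this.basePath = basePath; }
        public void setPattern(String pattern) { this.pattern = pattern; }

        public Map getProperties() { return properties; }

        public void validate() throws CruiseControlException {
            if (basePath == null || pattern == null) {
                throw new CruiseControlException("basePath and pattern are required");
            }
        }

        public List getModifications(Date lastBuild, Date now) {
            List modifications = new ArrayList();
            Pattern regex = Pattern.compile(pattern);
            File[] children = new File(basePath).listFiles();
            if (children == null) {
                return modifications; // share missing or unreadable; report nothing
            }
            for (int i = 0; i < children.length; i++) {
                File dir = children[i];
                boolean isNew = dir.lastModified() > lastBuild.getTime()
                        && dir.lastModified() <= now.getTime();
                if (dir.isDirectory() && isNew && regex.matcher(dir.getName()).matches()) {
                    // A new release directory appeared: report it so the UI tests run.
                    Modification mod = new Modification("directory");
                    mod.modifiedTime = new Date(dir.lastModified());
                    mod.userName = "release";
                    mod.comment = "New release published: " + dir.getName();
                    modifications.add(mod);
                }
            }
            return modifications;
        }
    }

Wired into the config it would then look something like this (again a hypothetical fragment; the plugin element maps the class onto a tag you can use inside the modificationset):

    <plugin name="newdirectory" classname="NewDirectoryListener"/>
    ...
    <modificationset quietperiod="30">
      <newdirectory basePath="\\buildshare\releases" pattern="suite-.*"/>
    </modificationset>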

All up I think it’s an extremely solid product, and the cost of getting the source, figuring it out and getting a new plugin running was low enough. The only downside I think is that plugins, despite being relatively simple, are a bit of a pain to install and run. My current thought is that if we could supply a new plugin or customise via a scripting language, it would be considerably simpler than deploying a new jar into CruiseControl plus the associated configuration in the config.


LOLCat Internet October 13, 2007

Filed under: development,internet,me — danielharrison @ 3:14 pm

IP version 6 is designed to allow squidillions (that’s the official measure, mind you) of Internet addresses. Designed for scalability, one of its features is to allow every conceivable device that wants to be connected to be, in fact, connected. Numbers scale perfectly; need more room? Sure, no worries, we’re now 2 to the 128 (squidillions) instead of 2 to the 32, and now everyone and everything can have its own static IP address and NAT falls by the wayside.
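To put a number on squidillions, a couple of lines of back-of-the-envelope Java:

    import java.math.BigInteger;

    public class AddressSpace {
        public static void main(String[] args) {
            BigInteger ipv4 = BigInteger.valueOf(2).pow(32);   // 4,294,967,296
            BigInteger ipv6 = BigInteger.valueOf(2).pow(128);  // roughly 3.4 x 10^38
            System.out.println("IPv4 addresses: " + ipv4);
            System.out.println("IPv6 addresses: " + ipv6);
            // every single IPv4 address could hide a further 2^96 addresses behind it
            System.out.println("IPv6/IPv4 ratio: " + ipv6.divide(ipv4));
        }
    }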

What’s obvious here is that human memory isn’t infinitely scalable, so we don’t remember these things and prefer not to be slaves to the machines. This may of course change in the next 100 years, and I go on the record now to endorse and welcome our new machine overlords. In the meantime, if we’re still losing things like our keys, wedding rings, and yes, physical money in the 21st century, then humans aren’t the ones going to do the changing in the short term. So what’s the answer? Words, a 5000 year old tradition.

Have you ever played 20 questions? In this simple game it’s possible, with a complete stranger, via simple yes and no answers, to identify a common element in 20 questions, in most cases fewer. In fact it’s such an easy thing that computers are trying to get in on the action. Mapping words, English predominantly, to numbers is obviously unscalable. There are inherent limitations when you remove context, and on top of this we have a rigid hierarchy to which you have to adhere. Ultimately, governmental regulation means that humans, with their ideals and machinations, define what constitutes valid and invalid names. There have been attempts at alternative, more liberal registries, but support from vendors is somewhat inevitably lacking.

When I was researching my domain, these regulations, and then sub-regulations in individual countries, meant that on top of an inherently unscalable concept, restrictions like trademark law and local government rules have so limited the pool that they’ve given rise to what I’m terming the LOLCat Internet phenomenon.

If you’re interested, the rules are here: http://icanhascheezburger.com/how-to-makes-lol-pix/

I ended up with the .com and may buy the .net to round these things out. I’m in Australia, so buying the .com.au would have been more appropriate, but the rules conspire against it. It’s also much more expensive than .net and .com. The rules basically mean that if I wanted a .com.au I would have needed to satisfy some quite restrictive requirements, e.g. an ABN [Australian Business Number], a registered business name that maps to the domain … For more info see: http://www.domainnameregistration.com.au/rules.htm

I may start a business on this domain for some software ideas I have, so a .com.au would have been appropriate. Under Australian rules I need all the above things just to even consider registering a name. An ABN is relatively easy to get (I do in fact already have one, and yes, I’m late with my BAS) and can be had online without having to talk to anyone. The remaining things start to require certain structures and other people: cosigners, company secretaries, trademarks, more registration; basically the establishment of a business. In this way the Australian system, in my opinion, acts to squash innovation and makes the .com.au artificially scarce, hence more expensive. This tends to mean that the .com is the most appropriate and pretty much a catch-all. I’d be interested to find out whether this has acted to quash startup establishment in Oz compared to somewhere more lenient like the US.

So what do we do while we wait for the singularity, when numbers gain the supremacy they deserve? Well, not much; as long as these things remain in a bureaucracy evolved from the last century, .com and .net are about it really, and we celebrate the LOLCat Internet.

The reason I write this, of course, is that I’ve decided to add my voice to the maelstrom, and picking a domain that was meaningful both to me and, hopefully, to readers was a tad trickier than expected. I ended up using http://instantdomainsearch.com/, which was invaluable.