Daniel Harrison's Personal Blog

Personal blog for daniel harrison

Embedded Tomcat 5.5 and Java 1.6 PSA November 27, 2007

Filed under: development — danielharrison @ 12:28 pm

Just a public service announcement, my guess is this particular circumstance will be pretty rare.

If you’ve been running embedded Tomcat under a wrapper class, you might have some problems watching stdin and stdout when you go to Java 1.6.

We ran an embedded instance in our application, with a wrapper that communicates with our controlling application via stdin and stdout, piping if you will. It appears that unless you define a root logger, the embedded instance will redirect stdout to a null output handler, which means you no longer get output. So if you’re writing status output that another application is listening for, it will appear to just stop. Defining a root logger with a handler of your own works around it:

import java.io.*;
import java.util.logging.*;

// Grab the root logger so the embedded instance finds one defined.
Logger logger = Logger.getLogger("");
try {
    // Log to a temp file rather than relying on the (redirected) console.
    FileHandler fileHandler = new FileHandler(File.createTempFile("embedded-tomcat", ".log").getAbsolutePath(), true);
    fileHandler.setFormatter(new SimpleFormatter());
    logger.addHandler(fileHandler);
    logger.setLevel(Level.ALL);
} catch (IOException e) {
    e.printStackTrace();
}

Also under Java 1.6, make sure you redirect standard in when invoking java from a .NET 2 process. This previously worked under 1.5, but probably shouldn’t have. My guess is Java 1.6 ‘fixed’ a lot of the stdin/stdout stream handling, which meant coincidental behaviour that previously worked no longer does.
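As a rough sketch of the principle (shown here from the Java side with ProcessBuilder, and using the Unix cat command as a stand-in child process just to echo data back; on the .NET 2 side the analogous knobs are the RedirectStandardInput/RedirectStandardOutput flags on ProcessStartInfo), the idea is to wire the child’s stdin and stdout up explicitly rather than assuming the child inherits working streams:

```java
import java.io.*;

public class ChildStreams {
    // Launch a child process, explicitly write to its stdin and
    // read back its stdout, instead of relying on inherited streams.
    public static String roundTrip(String message) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("cat").start();

        // getOutputStream() is the child's stdin; close it so the child sees EOF.
        Writer stdin = new OutputStreamWriter(p.getOutputStream());
        stdin.write(message + "\n");
        stdin.close();

        // getInputStream() is the child's stdout.
        BufferedReader stdout = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String echoed = stdout.readLine();
        p.waitFor();
        return echoed;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("status: running"));
    }
}
```

If the parent never wires up (or never closes) the child’s stdin, the child can block forever waiting on input, which looks a lot like the hang described above.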

See http://www.mail-archive.com/users@tomcat.apache.org/msg36244.html for a thread where someone else was having a similar problem.

 

Why The Era Of The Insider Will Remain November 24, 2007

Filed under: development — danielharrison @ 10:30 pm

Digital Digressions posted an interesting article about insiders and how technology seems to be displacing them. This is a topic close to my heart, and I’m going to disagree and say the era of insiders will never end; it will merely be a transformation. Why? Well, first I think it takes an examination of some fundamentals.

In economics there’s a concept called information asymmetry. Under this theory, insiders can use their knowledge of the market, and the cost to outsiders of acquiring that knowledge, to either exploit the market directly or charge a fee or commission that lets outsiders act like insiders, à la real estate agents. The less open a market, or the more complicated it is, the more the insider can exploit this and the higher the premium they can charge to outsiders. Lag measures are a symptom of a closed or opaque market, in that it takes time for the true state of the market to become exposed to outsiders. Real estate in many cases is a grossly inefficient market, which is why the role of the real estate agent exists: an insider who can theoretically maximise profit via inside information.

If the real estate market acted independently, and house sales were purely a function of house size, block size, a dummy variable for a pool, and so on, then you could expect the insiders to disappear as information becomes more abundant and available to all, but that’s not the case. Perfect information is a concept that helps explain this: it’s the state where all information is shared between all participants, and sellers know as much as buyers. As long as perfect information cannot be obtained, insiders will be able to exploit the cost of acquiring knowledge, and so will continue to exist and make money by capitalising on experience and inside knowledge of the market. Real estate is not a market that can be captured or explained purely by the numbers. The numbers help, and can act as a dampener on irrational behaviour, but there are things that are not captured in closing prices.

In Canberra we have a local property site called allhomes, and it has some ridiculous market penetration: they claim their independent Nielsen monitoring shows 99% of people who buy property in Canberra visit the site before purchasing. It has information on all sales in Canberra for the last n years, along with block size, rates, suburb sales info, pictures, etc. It’s great, and it’s how we found our home, so at least in this market there’s some good experience with online property sales. Anecdotally, though, I’d have to say the number and use of real estate agents is not diminishing. The market in Canberra is pretty solid, as it has some interesting demographics (a large public service with a higher than average income) that isolate it from shocks to some extent, so it may not reflect other markets. One thing you do see is that real estate agents list on the site as well as in newspaper advertisements; one of allhomes’ taglines to sellers is ‘before you list, insist’ on listing on allhomes as well. So while I think commissions for insider information are under pressure, sales volume is increasing over time. Buying a house is one of the most stressful things you can do, and that’s a strong discouragement against repeating the process; my guess is that as the ease of buying and selling increases, so will sales volumes, but it’s probably too early to tell. In terms of lag measures, though, while these sites are now easier to access, they’re still lag measures, as they reflect the closing price. It takes time to sell a house, and good agents can still outperform outsiders using their knowledge and experience of the market. A month can make a huge difference in the closing price of a home. Getting it right when others get it wrong is the best outcome, and this is a skill that insiders (investors, good real estate agents) have.

One thing not to forget is that this information previously wasn’t available to real estate insiders either. Insiders, with their experience and existing knowledge of the market, can also capitalise on the new information, which allows them to increase their efficiency as well. This means they in turn can still outperform outsiders and charge a fee for their knowledge. So I don’t see the era of the insider as over, but rather see inefficient agents being squeezed out and the way agents behave changing. This should make buying a house easier and more efficient, and the flip side is that an efficient market typically sees more sales, so insiders who can capitalise on the information may charge a lower premium but make up for it in volume.

This is why I think the future is more in property sites like Redfin, where they still have agents (insiders), but because of changes in the market the agents act differently. One of their staff outlines why they chose to be a Redfin agent and highlights the differences between becoming a traditional agent and joining Redfin. The point I think speaks volumes is ‘work cooperatively with fellow agents’. House characteristics mean different things to different people, and one of the skills of a real estate agent is intimate knowledge of a neighbourhood, street, house, and people, which allows them to act as a broker and map buyers to sellers. Previously the market was seller driven, really: an agent had a specific set of houses they sold to everyone, and it was about the stock of houses an agent had access to. The better agents/agencies had better houses and the cycle would repeat; the more they sold, the better their access to housing stock. Compensation had to be price/time based, which meant aggressive competition between agents, which didn’t necessarily mean the best outcome for buyers or sellers. This is where I think online, information-rich clearing houses are changing the behaviour of insiders. Now that everyone has the same information and knowledge of the market, it becomes more about the skill of matching buyers with sellers more directly. Sales become more cooperative, as the insiders can use their shared knowledge of the market to outperform the individual and maximise volume with more informed buyers and sellers.

When a market is opaque to outsiders, insiders can exploit this and make more money by charging a higher premium per transaction. The more inefficient the market, the more likely this is to occur. Once that information is available to all, the insider loses that advantage and has to modify their behaviour. The knowledge they have is not worthless, though, and by reusing knowledge and acting in groups they can draw on each other’s knowledge to again exploit the market better than outsiders. The insider doesn’t disappear, but is rather transformed, like a phoenix.

Another article on Wired describing how real estate agents sell their own home provides interesting information as well.

 

Touch Typing November 22, 2007

Filed under: development — danielharrison @ 1:51 am

My mother had a few jobs in her career, one of these was teaching secretarial skills in TAFE (Technical and Further Education). When I wanted a computer when I was young, the deal was I had to learn how to touch type first, it wasn’t just a toy.

Photo from Will Davis, portable type writer page

Mum was a canny one, so I had to learn before she would get the computer, and I learnt on her portable typewriter, like the one pictured, with its big clunky keys that you could jam when you really flew. So off I went, from ‘asdf asdf asdf’ to ‘the quick brown fox jumped over the lazy dog’. Eventually I picked it up and I got my computer. It was great, the main benefits being keys you didn’t have to slam down harder as the ribbon ran out, and no more ink staining fingers and anything they touched. “Daniel, I don’t understand how you got this mark on the ceiling!”

So onto today. I believe touch typing is one of the biggest comparative advantages a developer can have, and I’m surprised just how many developers I see who haven’t actually mastered it. Go learn now! People go to great lengths to use an IDE that works for them and is perfectly tailored, but then, when it comes to the raw character entry which is inevitable (at the moment :), they overlook the physical skills that can improve their development.

The worst thing when you’re in the zone is context shifting, having to stop and either wait for the hands to catch up or not being able to output as fast as you think is a subtle context shift. Basically for me touch typing means I only need to think about the problem and not the output and it’s one of the most important physical skills I have for software development. Nowadays I remember back to the toil and tedium of learning touch typing and can’t believe I did so much to get out of it.

So, safe in the knowledge that my mum uses the Internet mainly to exchange pictures of cute animals and forwarded malware, and that she won’t read this blog, here it is: yes, you were right, I will actually use it.

 

Technical Architect Interview Advice November 19, 2007

Filed under: development — danielharrison @ 1:58 pm

There’s a paradigm out there, I think, in some organisations that technical consultants/architects don’t write code. In a small product company this just doesn’t hold. If you’re going out to client sites to explain how to integrate, you need to be able to talk to the whole gamut, from management to coders. The success of the project falls to the people on the other side actually doing the implementation, and if you can’t guide them and act as a mentor/lead when necessary, then the success of the project is at risk, which affects the success of the product.

The 20/80 divide between excellent and average means that, doing a rough reversal, 20% of the people writing the code will be below average; you will need to mentor them, and you should be a shining beacon of coding excellence.

That means when you interview you shouldn’t act as though coding is below you.

If you rate yourself as 10/10 in Java but haven’t coded in two years, and then fail the intermediate/advanced Java questions with “I know, but can’t remember, as it’s been a while since I wrote Java day to day”, then you shouldn’t have rated yourself a 10. It’s just going to annoy the people interviewing you. A 7 or 8 with a passion to learn and grow would, in my opinion, be a more successful strategy.

 

Close To The Problem November 16, 2007

Filed under: development — danielharrison @ 12:59 pm

I work in a small growing company, and to our competitors I think we’re a bit disruptive. I was musing on the future and something occurred to me: is the reason small, agile companies can be so successful that they can get closer to the problem than established enterprises? When going up the adoption curve, just after crossing the chasm, while still small and light, the company structure tends to be pretty much flat, with zero barriers to communication. When a company is small and growing you tend to hire after the fact; resources are constrained, so engineers are intimately connected with users because there’s no intermediary. To some extent I think during the growing phase they tend to be almost jacks of all trades, and the stretch is a good thing. Developers/engineers are forced to make decisions all the time which have big implications for the success of the software. Being so connected at that stage, I think, means the decisions made tend to be the right ones, because the technical staff are closer to the problem; hence solutions that are innovative and solve user problems. Is the stretch a function of success, or a prime contributor to it?

This leads to a point that’s been rambling around in the back of my head: what does this mean for the role of the business/product analyst, or more generally a position that owns user interaction and requirements elicitation? Do you really want developers who can’t talk to users or elicit requirements? I think having an intermediary instantly creates a choke point. Every meeting and conversation requires the intermediary. User interactions are now owned by a particular person, and the serendipitous solutions that once occurred because of the close interactions start reducing. Going directly to users for their thoughts means you are encroaching on the intermediary’s responsibilities, and can pretty easily come off as a lack of faith in their judgement, or as sidelining them. The upside is more predictable release schedules, issue resolution, easier to build software, and so on.

Part of my musing was kicked off by 37signals’ article on personas, which crystallised a number of thoughts I was having. Being close to the problem means you can more efficiently solve user problems. Analysts given responsibility to produce personas, use cases, the whole kit and caboodle to communicate user desires implicitly turn the focus from what the user wants to the documents, with the written output as the review process. Reviewers, technical staff, documenters, testers and designers lose the ability they once had to frame the solution from their intimate knowledge of the user, and rely on the personas to frame the solution, which could very well be the wrong solution. Review becomes about the ability to build the proposed solution, its schedule, look and feel, and testability, instead of the naked problem. The people doing the work become too distanced from the problem.

Does this mean I think we should kick out all the analysts, like some agile approaches do? Well, I’m not prepared to go that far yet, and I’m not convinced it’s the best course of action. My main thinking is that development and user communication needs to be an inclusive process. Developers must be close to the problem.

 

Resharper + IDEA November 1, 2007

Filed under: development — danielharrison @ 11:59 am

One of the things I picked up in my last job (pure J2EE/Swing) was JetBrains IntelliJ IDEA. It’s a great tool, and for me is one of the things that just works. I know a lot of people use Eclipse and NetBeans, but I want something that just works, and the best results I’ve had are with IDEA. I use NetBeans for Ruby editing (the 6.0 Ruby editing is great!), but as a day-to-day development environment, IDEA for me is focussed on getting things done. While the community does develop great things, I find that when an IDE costs money it gives it that extra impetus. IBM, SAP and CodeGear all develop on top of Eclipse, but the value of the platform is in the functionality they provide on top of it. I’ve used the SAP dev tools and IBM WID (WebSphere Integration Developer), and to me they’ve always been a little flaky, slow, and suffering a bit from multiple personality disorder. The thing I like most about IDEA is that it’s blindingly fast and light.

It’s a tool I’ve introduced at my current work, and while I’ve tried not to push it too much, through people checking out trials we’ve gained slow adoption, to about 75% of the team. One really good thing about the philosophy here is that we’re IDE independent, at least on the Java side. This means Ant, NAnt and CruiseControl are our build tools, and the IDE choice is up to you. It’s a winning philosophy: we accept that, given the skills and knowledge of the developers, trusting them to adopt what’s right for them means they’ll do what’s right by the product. It’s a bit like shared space road design: if something seems risky, they’ll think about what they’re doing, which typically results in better outcomes. Then again, our team is very heavily slanted towards senior developers, so in this context it works well.

The other thing I do here is .NET development, and coming back to Visual Studio after a hiatus was a real shock. Visual Studio as I remember it was a great IDE; in fact I used to use it to do Java development (Microsoft’s JVM + COM), and at the time it was one of the best. After being spoilt, I guess, by the great quality of the Java IDEs out there, it was a real shock coming back and finding that it seemed so similar. Visual Studio is a good IDE, but I’m a strong believer that in IT today, if you’re not going forward then relatively you’re going backwards. The problem, I think, is that there are literally millions of developers who have never tried anything else, through either no need or no motivation, and it works for them. So to some extent changes are evolutionary rather than revolutionary; the peril of customers, I guess. I discovered ReSharper and it’s been a great help to productivity in Visual Studio. When I use IDEA, coding tends to be like morse code: first few chars, ctrl+space. It’s smart enough not to throw the whole world at me, and knows what I’m referring to. With touch typing it lets me output code almost as fast as I think, and it means I get the feeling my IDE works with me and doesn’t get in the way. With ReSharper 3.0 it’s now the exact same key combos, which means I can switch between IDEs and not have to context shift. The work I do means I tend to be doing both Java and C# at the same time, which is probably a bit rare. This is an absolute boon for my productivity, and I think it’s a credit to the guys at JetBrains that they’ve made such a great pair of tools. They just need to get their JRuby plugin cooking; oh, and while we’re on wish lists, I’d like Scala and Erlang ones as well, but maybe the Meta Programming System will usurp them all. 🙂