
dyson blade

The fastest, most hygienic way to dry your hands. What a strange way to line-extend your brand. On the other hand, it was pretty effective.



kick ass pizza

At Dough’s (named after owner Doug Horn) on Blanco. Made in a special oven brought in from Italy … Yum!


a to z

Aristocratic wines at democratic prices – it is a seriously drinkable pinot noir. I was surprised to find it so available here in San Antonio. According to our waiter it is owned by Gregg Popovich – mystery solved! Turns out to be better wine than the first two games of the Western Conference Finals :(


grill it up

Time to fondue a lot of good meats


A Lesson From Leo

Happy families are all alike; every unhappy family is unhappy in its own way. – Leo Tolstoy

It is funny – if I look back a year, I had two very big software projects (since I can’t say what they were, we’ll call one Y and the other X). Y went really well, and by all accounts is a big success! What about X – not so much. It is slowly turning out that the second project is going to have a much more lasting effect on my design and thought processes than the first. I haven’t worked directly on X in months and it is still teaching me lessons – proving not only that I have a lot to learn, but also that your mistakes can teach you a lot more than your successes.

I recently gave a presentation on a number of lessons I learned from the project (a couple of those ideas I’m hoping to sit down and actually write out for posterity – and hopefully as a reminder so I don’t repeat them). But just yesterday, I got another lesson out of X! It really is the gift that keeps on giving.

My World

First, let me say that my development world is fairly constrained. Although I have worked on some seriously scaling infrastructure, by and large my focus isn’t on those kinds of problems. I always end up focusing on really complex domains. I largely credit one book with really having an impact – Domain Driven Design. I first read it back in 2004. It does what a great book of this sort should do – it introduced me to great new ideas and also confirmed things I was already doing (plus it gave me a language to finally describe them).

I make this point because it means I spend a lot more time thinking about what is represented in a given system (and how it is represented) than I do about how many transactions the database can handle. Because of that, the details of the domain are seriously important.

Actual vs Logical

Let’s talk about something that is totally outside of the domain I work in. How about cars?

I have one – it has the best feature of all the cars on the road – it is paid off! (I highly recommend that as a feature).

I have a VW Beetle.

For most circumstances, that is plenty of information. Meaning: I have a car. If you want to know whether I can get somewhere, that is probably enough information.

When it comes time to go to lunch – we have to figure out who is going to drive. My car can fit 4 people. That detail is important depending on how many people want to go.

If I want to find out how much my car is worth on Kelley Blue Book I have to add the detail that it is a 2000 GLS Turbo (plus a lot of other details – 5 speed, seat warmers, etc.).

The point of this is – if you asked me about my car – what you want to know about it is driven by the problem you are trying to solve by asking the question. If you ask:

Can you get to work?

Yes, I have a car. (Totally sufficient answer)

Yes, I have a 2000 VW New Beetle GLS Turbo Hatchback 2D with a 5 speed manual and seat warmers. (Is a bit nutty)

I’ll say that this is a scale from the logical to the actual – the logical being a sort of encapsulation that provides very limited information, and the actual having a ton of details.

Some examples:

Logical: Large Coke
Actual: 24oz wax-coated paper cup – Coke with 35% ice, lid, no straw

Logical: 31D
Actual: 31D on an American 777-200 Ver. 2 – which means it is an awesome seat

Logical: Some Chips
Actual: Corn, Vegetable Oil (Contains One or More of the Following: Corn, Soybean or Sunflower Oil), Buttermilk Solids, Salt, Tomato Powder, Partially Hydrogenated Soybean Oil, Corn Syrup Solids, Corn Starch, Whey, Onion Powder, Garlic Powder, Monosodium Glutamate, Cheddar Cheese (Cultured Milk, Salt, Enzymes), Nonfat Milk Solids, Sugar, Dextrose, Malic Acid, Sodium Caseinate, Sodium Acetate, Artificial Color (Including Red 40, Blue 1, Yellow 5), Spice, Natural and Artificial Flavor, Sodium Citrate, Disodium Inosinate, and Disodium Guanylate.
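
To put the same idea in code, here is a rough Ruby sketch of the two ends of the scale (the classes and values are invented for illustration – they aren’t from any real system):

# Logical: just enough to answer the everyday questions
LogicalCar = Struct.new(:make, :model, :seats)

# Actual: the level of detail Kelley Blue Book wants
ActualCar = Struct.new(:make, :model, :year, :trim,
                       :transmission, :doors, :seats, :options)

logical = LogicalCar.new("VW", "New Beetle", 4)
actual  = ActualCar.new("VW", "New Beetle", 2000, "GLS Turbo",
                        "5 speed manual", 2, 4, ["seat warmers"])

logical.seats   # => 4 - plenty for planning lunch
actual.trim     # => "GLS Turbo" - needed for an accurate valuation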

I realized one of the problems with Project X was exactly this. Namely, that it collected enough information to drown an actuary. (Amazingly, there is an entire site devoted to jokes about actuaries – www.actuarialjokes.com.)

Why did it collect so much detail?

Data Pack Rat

I’m a 5th generation pack rat in the real world (I’ve been through some counseling and my wife has helped a lot), but I’m ten times worse in the digital realm. I collect data – of all sorts. This inevitably bleeds over to the things I work on.

Solving The Hard Problems

That was not the real reason – even if it is a convenient answer. The real reason is that the really tough problems in the domain for X required the data. Sure the day to day stuff only needed to know the basics – but the hard problems are hard precisely because they depend on knowing the details involved. This is the same reason that Kelley Blue Book needs so much detail about my car – it is hard to give an accurate estimate if you don’t know what you are estimating.

CSI & The Infinite Zoom

This is the other part of the reason why getting the actual can be so important. On CSI, they strive for what seems like scientific accuracy (one can only assume, as a layperson). That being said, there is one situation where all the rules are thrown out. They love to zoom into pictures infinitely – even when they are digital. The worst case was the time they used footage from a security camera to get a picture of someone’s eye – and then looked at the reflection to catch the killer. Going to have to get one of these futuristic security cameras – since every one I’ve ever seen is seriously blurry.

What is the point of that little rant? Simple: if you don’t have the resolution, you can’t just make it up. The same rules apply. If you only collect the make and model of cars (like they did to get my parking tag) you can’t really say how much it would cost to replace that vehicle. If that is your goal – you’re screwed. My solution is typically to think about all the data we are going to need and work backward.
This doesn’t always work out – Project X being an example.

The Thin Vertical Slice

There is another complication in this whole process. We do a lot of Thin Vertical Slicing. It is incredibly useful since it means we get better feedback from the users and it keeps us from building a lot of stuff we don’t end up needing. The downside is that when you are dealing with a lot of data that has to be collected (i.e. a worldwide audit of something), you have to decide: do you collect only what you need now, or do you grab as much as you can while you are there? Once you grab it, what do you do with it? If you track it but don’t include it in the tool you roll out, how do you keep the data up to date? If you ignore it, you’re going to be back to audit the rest eventually (or maybe you won’t? YAGAI (You Aren’t Gonna Audit It), a subset of YAGNI).

Matching The User

This is the real crux of the issue. Both Project X & Y obsessed over the actual. Project Y was a success – X was not. Why not? At the end of the day, a simple answer – matching the domain data to the user.

In the case of Project Y, the users of the system were also obsessed with the actual data. The tool let them manage the data down to the smallest detail. Which made them more accurate and more effective. They felt no sense of overload because they were steeped in the details as part of the job. The tool just made it easier to switch to a new task. Awesome!

In the case of Project X, the users of the system didn’t care as much about the actual data. They just needed a couple of logical collections to make their decisions. The tool forced them to wade through so much data to get anything done. Which meant that when you were doing something very, very complicated it was great – but most of the time things weren’t complicated, which meant it sucked.

The lesson here is that we could have had both. We could have had the detail underneath (because we actually needed it to solve problems), but exposed less of it unless you wanted to dive in. We actually moved this way at the end of the project (somewhat unconsciously), but by that time other issues with the system were in the way of success.
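
In code, the shape we were edging toward looks roughly like this – a hypothetical sketch, not the real Project X model: the actual detail lives underneath, the logical view is the default, and the detail is one explicit call away.

class Vehicle
  def initialize(actual_attributes)
    @actual = actual_attributes   # the full detail: year, trim, transmission, options, ...
  end

  # The logical view that answers most day-to-day questions
  def summary
    { :make => @actual[:make], :model => @actual[:model], :seats => @actual[:seats] }
  end

  # Dive into the actual only when the hard problem demands it
  def details
    @actual
  end
end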

Final Thoughts: So basically – keep in mind the problem you are trying to solve and the data required to do it, but overwhelm your users with that data at your own risk!

p.s. There is probably a corollary to this idea relating to integrating applications via services. I haven’t finished this idea, but it’s probably along these lines (almost stolen verbatim from a friend). Namely, that when you are consuming services, focus on what you need – and nothing more. Meaning:

  • Don’t validate data from the service that you don’t care about, since you don’t want to throw out a response just because it includes stuff that doesn’t affect you
  • Don’t ask for more data than you need, since you’re wasting resources and confusing the issue of what data you actually care about

Both things are part of the process of keeping the services from becoming too intertwined, but they still manage to loop back to this idea of getting the detail level right.
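
As a rough sketch of what the consuming side looks like (the response fields below are invented for illustration):

require 'json'

# A hypothetical service response - it may carry far more than this application cares about
raw = '{"id": 42, "status": "active", "internal_audit_trail": [], "billing_codes": []}'

# Pull out only the fields we actually use and ignore the rest, so extra or
# unexpected fields never cause us to reject an otherwise good response.
payload = JSON.parse(raw)
record  = { :id => payload["id"], :status => payload["status"] }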


they just got engaged

Yup – this ad just showed up and I thought: weird, they must have the wrong theater… Then a dude showed up with a ring. Congrats, Nathan! (Now I just hope Indiana Jones doesn’t suck and destroy the rest of their relationship.)


the best pastrami in sa

Marty’s New York Deli at 1604 & Blanco – seriously good.


Anyone Heard Of Varnish?

I was doing some research on caching reverse proxies, related to some REST stuff. I figured I’d end up at Squid, since it seems like the standard for this sort of thing. Then I hit an article that got me to Varnish:

Varnish was written from the ground up to be a high performance caching reverse proxy. Squid is a forward proxy that can be configured as a reverse proxy. Besides – Squid is rather old and designed like computer programs were supposed to be designed in 1980. Please see ArchitectNotes for details.

I have no idea how well it works, but two things stood out: the first was funny, the second interesting.

From the FAQ:

Does Varnish require the system to have a C compiler?

Yes. The VCL compiler generates C source as output, and uses the system’s C compiler to compile that into a shared library. If there is no C compiler, Varnish will not work.

.. Isn’t that a security problem?

The days when you could prevent people from running non-approved programs by removing the C compiler from your system ended roughly with the VAX 11/780 computer.

The second was a discussion about how Squid’s basic architecture is simply wrong for the problem it is trying to solve.

Take Squid for instance, a 1975 program if I ever saw one: You tell it how much RAM it can use and how much disk it can use. It will then spend inordinate amounts of time keeping track of what HTTP objects are in RAM and which are on disk and it will move them forth and back depending on traffic patterns.

Well, today computers really only have one kind of storage, and it is usually some sort of disk, the operating system and the virtual memory management hardware has converted the RAM to a cache for the disk storage.

So what happens with squids elaborate memory management is that it gets into fights with the kernels elaborate memory management, and like any civil war, that never gets anything done.

When I have more time, this looks like something I’m going to have to try out…


Sometimes the simple way is best

Testing flash.now with RSpec – Xavier Shay’s Blog

This turned out to be the simplest way I could find to easily test flash.now – my only addition was a simple helper method:


def keep_flash_now
  # Extend this controller's flash with the DisableFlashSweeping module from the
  # linked post, so the sweeper doesn't clear flash.now before the assertions run
  @controller.instance_eval { flash.extend(DisableFlashSweeping) }
end

It makes it easier to understand what is going on in the test.
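
For context, if I’m remembering the linked post correctly, DisableFlashSweeping just turns the sweeper into a no-op. A usage sketch (the controller and message below are made up for illustration, and it assumes an RSpec controller spec):

module DisableFlashSweeping
  def sweep
    # intentionally empty - leave flash.now values in place for the spec
  end
end

describe WidgetsController do
  it "sets a flash.now notice" do
    keep_flash_now                 # keep flash.now around for the assertion
    get :index
    flash[:notice].should == "Hypothetical message"
  end
end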


Jake to the Rescue

Ok, so I’ve been dealing with an old app for the last two weeks. It is woefully out of date (several libraries are behind – not the least of which being Rails). And it looks like at some point I threw discipline out the window and added some stuff without writing tests (before or after adding the code).

On top of all of that, I’m back in the guts of it because I need to add a pretty major new feature – and the state of the code has me running for the hills.

I got all the tests to pass last week – but doing some coverage analysis I found that there are big sections of the code that are not tested at all. (Yes, I know 100% coverage doesn’t mean 100% bug-free – but it will generally alert you to the obvious bugs.)

I was trying to get Rcov to sort some of the results for me so I could focus on the worst offenders, and in the process of figuring out how to do that I stumbled on metric_fu.

Metric_fu is a set of rake tasks that make it easy to generate metrics reports. It uses Saikuro, Flog, Rcov, and Rails built-in stats task to create a series of reports. It’s designed to integrate easily with CruiseControl.rb by placing files in the Custom Build Artifacts folder.

It’s really cool – it not only made it seriously easy to sort out the coverage stuff (and it works well with RSpec), it also showed me some new tools for finding other code smells. It alerted me to a particularly nasty controller method that is apparently doing a hell of a lot more than a controller method should.
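
If I recall the setup correctly, getting it going is about as minimal as it gets – roughly this in the Rakefile (treat it as a sketch and check the metric_fu docs rather than taking the task names on faith):

# Rakefile - assuming the metric_fu gem is installed
require 'metric_fu'

# then, from the shell:
#   rake metrics:all
# which generates the Saikuro, Flog, Rcov, and stats reports mentioned above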

The coolest part of this is that not only is it a neat tool, but I actually know the guy. We worked together on a project a year and a half ago. Small world, I guess.

