Wednesday, June 15, 2011

Going Solar (It Rocks!)

Almost a year ago, we went solar on our house. I tweet quite a bit about it and get asked lots of questions. It has taken a year, but I finally got around to putting this together to help answer some of these inquiries. My hope is that it will prove informative to all who are looking at solar on their own house.

We got quotes from the following companies:

Namaste - has lease option (SunRun)
REC Solar - has lease option (SunRun)
Lighthouse - has lease option (their own leasing plan)
SimpleSolar - did not have a lease option
RealGoods - has lease option (SunRun)

In addition, I contacted Bella, but they never got back to me.

I originally intended to purchase the system outright. Doing so, you get a 30% Federal tax credit, a Colorado State tax credit for your sales tax, and a rebate from Xcel based on the size of the system.

The Xcel rebates are based on meeting some predetermined goals for the amount of electricity generated by solar systems. The rebates have two components: a flat payment per watt of installed capacity plus an additional Renewable Energy Credit (REC) payment for energy generated. The rebates are on a sliding scale which decreases over time. Had we purchased, we would have gotten in at $2/watt + $0.55/kWh (and this is what Xcel paid the leasing company). Xcel is currently paying $1.75/watt + an REC of $0.04/kWh. See Xcel Solar Rewards for more info on this. If you are thinking of going solar, get in while the getting is good (the great getting is already in the past). A final note: Xcel will only reimburse you for systems that are less than 10 kW and less than 120% of your last year's usage.

If you generate more power than you use, the excess is banked on a monthly basis. In January, if your bank is positive, you have two options: (A) keep it banked, or (B) sell it to Xcel at the Average Hourly Incremental Cost of electricity (AHIC). Previous years' AHICs:
  • 2010 - 3.000¢/kWh
  • 2009 - 3.058¢/kWh
  • 2008 - 4.842¢/kWh
  • 2007 - 3.414¢/kWh
  • 2006 - 4.291¢/kWh
Note: If you choose option A, it is forever; you can't go back. So far we have waived the decision, which defaults us to B. Given that we generate significantly more electricity than we use, banking the extra doesn't make much sense for us (we would never use it), so we will likely continue with option B in the future.

Even with the rebates, these systems are expensive. After looking at the cost of financing, we determined the payback would be minimal. So we looked at the leasing option, which turned out to be attractive for a few reasons.

Here is why I like the lease:
  • Low upfront payment, fixed monthly payments for life of lease
  • Lease is for a guaranteed amount of energy generation per year
  • Leasing company takes care of insurance and maintenance of the system. The inverter is going to go out in 10-12 years; they will have to replace it (around $3,500). Hail damage? They have to repair or replace.
  • If we generate more power than we use, it is ours, even if it exceeds the guaranteed amount. This extra can be banked or sold back to Xcel at our discretion.
Given the above, some things to consider with the leasing option:
  • Leasing company gets all of the tax credits and rebates
  • Slight headache if you sell your home, in that you need to get your buyers qualified for the lease (or prepay the lease, or buy out the system). It looks like SunRun makes this very easy, so this is a minor point.
As noted above, with one exception, all of the companies we talked to used SunRun as their leasing provider. Since we signed on the dotted line, two other leasing companies have arrived on the scene, Sunergy and SolarCity. I do not have any experience with either company, but they are probably worth talking to. Also, it is worth noting that these leasing companies only operate in states where the utilities have programs that make it financially attractive to install systems. I learned this while looking into the issue for some relatives in WY and UT.

The system we chose was a 7.56 kW system, installed by Namaste Solar and leased from SunRun. I was very happy with every company I talked to (except for Bella, who couldn't be bothered to even return my call). In the end, the decision was based on two things. I asked each company to "max" my roof out, figuring that as long as I was jumping in, I should go all the way. I also ran financials on each proposal, factoring in the downpayment, the leasing cost, our average usage based on the past year (1100 kWh per month), and assumed a 5% annual increase in the cost of electricity.

The Namaste proposal won out on both counts. My simple (and, it turns out, conservative) financial analysis indicated we would save $20K over the 20-year lease. This system had the largest downpayment ($1800), due in part to the need to move a vent on the roof to accommodate the panels. Moving the vent was the tactic that allowed them to propose the largest system. The system is guaranteed to generate 11 MWh a year (after 10 months, the system has averaged 1 MWh/month, so we are looking at 12 MWh+ for the first year), which is approximately 80% of our 2009 average usage. Payments are $76 a month for 20 years.
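For the curious, here is a minimal sketch of the kind of year-by-year comparison I ran (in Groovy, since that's my language of choice). The starting electric rate is an assumption I picked for illustration, so the output will not match my spreadsheet exactly:

    // Back-of-envelope lease vs. no-solar comparison. The starting rate is an
    // assumed value for illustration; the rest comes from the proposal.
    def monthlyUsageKwh = 1100      // our average use over the prior year
    def rate            = 0.10      // assumed starting cost in $/kWh
    def escalation      = 1.05      // the 5% annual increase in electricity cost
    def leasePayment    = 76.00     // $/month, fixed for the 20-year lease
    def downPayment     = 1800.00

    def savings = -downPayment
    (1..20).each { year ->
        def withoutSolar = monthlyUsageKwh * rate * 12   // utility-only cost for the year
        def withSolar    = leasePayment * 12             // lease cost (ignores the $8 grid fee)
        savings += withoutSolar - withSolar
        rate *= escalation
    }
    printf('Estimated 20-year savings: $%.0f%n', savings)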

To compare, this system would have cost us $28K to purchase outright. I am pretty sure this is after Xcel rebates but before tax credits; those details are a bit hazy, as I stopped looking at the purchase option very early.

One other nice to have from Namaste that none of the other installers quoted was a screen guard around the base of the panels on the roof. This will keep critters out from under the panels and off of our roof.

So how are we doing? The system went live the last week of June 2010, and since then we have been net producers, generating more electricity than we have used. Since turning the system on, we have generated a total of 10,204 kWh of energy, of which 3,338 kWh were put back on the grid (as of May 2011). Translation: for the past ten months, we have generated more power than we have used, by a significant amount!

But wait, I mentioned that the system was only sized for 80% of our average use last year? Well, at the same time that we got the system, we made some changes around the house to reduce our usage. Anyone who knows me knows that I like it COLD. Normally, my AC thermostat would be set to 70 and sometimes 68. This summer, we set it to 74 during the day and 72 for four hours in the evening. I know that this doesn't sound like a sacrifice, but those closest to me (and those who have observed how I dress in the winter) know that I have been less comfortable than I would have liked. The other change we made was to use small fans around the house to recirculate air instead of the HVAC fan. These changes have resulted in a considerable drop in our energy usage at the same time that we have added generation capability to our home. Net win!

Prior to getting the system, our electric bills averaged $120/month. Since we generate more than we use, our new bill should be just the lease payment minus any money we get from Xcel for buying back the excess. This is almost the case, but there is one added fee: an $8/month charge for being connected to the Xcel grid. There have been a lot of complaints about this fee from some quarters, but I can understand the rationale: I am using Xcel's equipment both to transmit my excess to the grid and to augment my supply when the system is not generating. I think the fee is reasonable, and even taking it into account, we are already money ahead of where we were this time last year.

Finally, let me describe the system. On the roof, we have 24 SunPower 315-watt panels. These panels are all wired up and feed DC power into an inverter on the side of the house, which converts the DC power to AC. This is fed into a meter used by SunRun to monitor production. A second meter follows the SunRun meter and is owned by Xcel to do the same thing (because they don't trust SunRun's meter!). Following this, the power is fed into our breaker box. The main house meter was replaced with a net meter (it runs forward and backward), and this last meter is the one we are billed from. In addition, I have added a TED 5000 system that lets me monitor our power generation and use in real time. The whole thing looks a bit Rube Goldberg in nature, but it works!
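If you are curious what the real-time part looks like, the TED gateway serves its readings as XML over the LAN, so ad-hoc polling from Groovy is nearly a one-liner. A sketch; the gateway address and the XML element names are assumptions based on my unit, so check your own gateway's documentation:

    // Poll the TED 5000 gateway for the current net power reading.
    // The IP address and XML paths are assumptions; adjust for your setup.
    def gateway = 'http://192.168.1.50'   // hypothetical LAN address of the gateway
    def data = new XmlSlurper().parse("${gateway}/api/LiveData.xml")
    def watts = data.Power.Total.PowerNow.text() as Integer
    println "Current net power: ${watts} W (negative while we are exporting)"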

So during the day, the system generates power if the sun is shining. Namaste sent me an email informing me that the system would not generate power at night or on extremely cloudy days. That was almost a deal breaker, but we decided to continue! ;-) At night, we pull our power from the grid. As long as we pull less power from the grid than we put onto it during the day, our house is a net power generation plant.

One more item to note: we do not have a storage system (batteries). Such systems are extremely expensive and have to be periodically replaced. The grid becomes our storage system, which is fine, but there is a caveat. If the grid power goes down, our system shuts off. This ensures that our system does not put power back onto the grid while someone is working on it, causing injury or death. So when the power is out for our neighbors, it is out for us too.

I will continue to modify this as I receive questions and clarifications. I hope it provides useful information on going solar! Feel free to send questions my way and I'll help if I can.

If you choose to go solar and select Namaste and/or SunRun, please mention you heard about them from me as they have referral bonuses, and hey, any little bit helps.

Sunday, February 07, 2010

My Ignite Talk Experience

20 slides, 15 seconds per slide... a 5 minute talk. How hard could it be? Famous last words!

That was what I signed up for when I asked to present at the Denver Open Source User's Group Lightning Talk Night.

I had my topic in mind. I wanted to provide an overview of the basics of estimating tasks in agile development. There are many things I find compelling about agile development philosophies, but none more so than the elegant simplicity of how estimates are made and progress tracked.

It might sound strange, but the first thing I focused on was a title. My thought was that for a five minute talk, the title would be an important way to set the tone for the talk. I started thinking about what I would be talking about and the title just seemed to fall in place. I took this as a good sign.

I then proceeded to procrastinate for three weeks (actually, I was slammed at work, but still...). The talk was a couple of weeks away and I knew I had to get cracking. Evidently, my subconscious agreed and had actually been on the job, because I woke up at 3 AM one morning with the entire outline of the talk running through my brain. I quickly grabbed my iPhone and started frantically typing into the Notes app while everything was still fresh. I was surprised when I reviewed it the next day to find that it was a good outline. The final one did not deviate much from the original.

With outline in hand, I then started making my slides. I tried to go for a very simple title that conveyed the message of the slide and a graphic that supported that message, albeit in an abstract manner in some cases. Since I wanted to release the talk under Creative Commons, I had to search a bit for appropriately licensed media. FWIW, there are some very talented folks on the interwebs that totally ROCK!

After I had the slides completed, I wrote a script for the talk. This consisted of writing 2-3 short sentences for the slide, then speaking the sentences while timing to see if they fit within the 15 second allotment. There was a lot of iterative editing in this step. I wound up with a talk that was, in hindsight, a bit denser than it should have been. It fit the time limit, but barely.

I practiced the talk and thought I had it down fairly well. I had the script for each slide on a 3x5 card, just in case something happened. In practice, I found that glancing at the card was sufficient to guide me through the talk when I got stuck.

Then came the live performance. It was not the rousing success I had hoped for. I slipped up on my timing early in the talk. At 15 seconds per slide, it is very hard to recover from this, particularly when your talk is info-dense. I wound up speed reading my cards to keep up. I'm not saying it was a complete failure, but I would have liked it to go a bit smoother.

As for the night in general, there were some amazing talks on a lot of different topics. This format is such a great venue for getting a peek at many different ideas and technologies. Slides for the talks are posted at DOSUG Slides.

A couple of days after the talk, I got the chance to reprise it at Tendril. It went a lot better (still not perfect).

The whole experience has been both educational and fun. If given an opportunity, I highly recommend you give it a try.

You can review my talk, with audio, here. Let me know what you think.

Also, full disclosure, the recorded talk is slightly longer than the 5 minutes allowed. This is mostly due to the fact that the introduction was not counted as part of the talk.

Agile Estimation: The story has a point, but the point is unitless

Sunday, November 22, 2009

No Fluff Just Stuff, Fall 2009

Another year, another NFJS behind me. You would think by now it would be old hat, but it seems like every time I attend, by the end of the weekend, I feel reenergized and ready to crank on some more code. My LOTTD also seems to get a lot longer!

Jay Zimmerman, the founder of No Fluff, just gets it and always manages to put together the perfect geek conference. The speakers are top notch, the topics are relevant, and the food is great. This year, the wireless networking was top notch as well!

Before I get into the sessions I attended, I want to go into one of the less tangible benefits of the conference that folks thinking of attending may not consider, which is the networking opportunities. My weekend started with lunch with some friends, one of whom is a speaker, prior to the start of the conference. While we don't focus our conversations exclusively on technical topics, they tend to dominate. In this case, I learned a lot about GIS that I was not aware of before.

Similar things happened during conference meals. The speakers are great about mingling rather than being cliquish, and again the discussions were very informative. One of the memes this year was whether speakers should use live coding or pre-recorded demos in their talks. The two camps had strong opinions on the issue and there was a lot of friendly ribbing between them.

Personally, I got to see the good and bad of both approaches. In one talk, the speaker found that some of the Yahoo! web service APIs he wanted to demonstrate had changed, which broke his demo. He was a bit embarrassed by this and it is possible that some folks in the session were unhappy, but I felt that most attendees were very engaged, and it became a group debugging session, which in itself is a great learning experience. In another session that used live coding, a question from the audience sent the talk in a slightly different direction. Had the demo been canned, this would have been harder to do.

On the other side, though, one speaker's Mac crashed right before he was scheduled to talk. Because his demos were canned, he was able to borrow a laptop and go through his talk without missing a beat. Had he needed to install all of the tools used in his demo, it would have been a massive FAIL.

Jay summarized the debate saying if everyone agreed, someone would be redundant!

And now a summary of the talks I attended...

Spring 3, Groovier Spring - It has been a couple of years since I attended a Scott Leberknight talk. He has kept his tech edge and did a great job of reviewing some cool features in Spring 3 (and some lesser-known features from Spring 2.5). I'm not a Spring power user, but I have some grand plans for incorporating Spring into the product I am currently working on. The plan is to use Spring dependency injection to allow third parties to extend the product without needing access to our source code. None of the features discussed in the talk seemed to add any capabilities that I would immediately need for this. My biggest takeaway is that the Spring platform is very powerful, but borders on being overly complex as well.

That being said, there are some cool features available in Spring. Annotations seem to have taken the place of most XML config; in general, I like this. Spring has added extensive support for ReSTful web services, including a workaround for the fact that browsers do not support the PUT and DELETE verbs. Spring has great support for TDD, including JUnit 3 and 4 as well as TestNG. Transactional processing support is extensive; again, the idiomatic use appears to be via annotations. The other big takeaway from this talk was that you can add a global exception handler to a Spring-based system.

Scott also presented information on a groovier Spring. I like Scott's definition of Groovy...

"Java minus all the code I don't like writing plus closures, metaprogramming and more."

SpringSource acquired G2One last year (SpringSource has itself since been acquired by VMware). Based upon the first acquisition, a lot of folks felt that Groovy (and Grails) would get baked into Spring. These people were correct! Spring now has a script factory mechanism that supports BeanShell, JRuby, and, for the purposes of this talk, Groovy. Scott presented great examples of how to incorporate Groovy scripts that can be run by the Spring app. There are several (too many?) mechanisms that can be used to do this; fortunately, one of them appears to be the most straightforward. There are some security issues that need to be accounted for if you want to use this mechanism. In one example, the script is pulled out of a DB. Injection attack, anyone?
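To give a feel for that straightforward mechanism, here is a sketch along the lines of the Messenger example from Spring's own dynamic language documentation (not Scott's code). The bean gets declared in the XML config with something like <lang:groovy id="messenger" script-source="classpath:GroovyMessenger.groovy" refresh-check-delay="5000"/>, and Spring reloads the script when it changes:

    // GroovyMessenger.groovy -- a refreshable Groovy bean. In Spring's docs the
    // Messenger interface lives in Java; it is inlined here to keep the sketch
    // self-contained.
    interface Messenger {
        String getMessage()
    }

    class GroovyMessenger implements Messenger {
        String message = 'Hello from a Groovy script loaded by Spring'
    }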

Scott likes to open his talks with a joke, typically some kind of pun. This time, he started with an oldie, but a goodie. What is the object oriented approach to getting rich? Inheritance! But someone in the audience threw a curve ball I had not heard before... Marry a rich Singleton. LMAO!

Grails w/o GORM and ReSTful Grails - Scott Davis, the person who introduced me to Groovy and Grails, did a great job on these two complementary talks. Both talks focused on using ReST in Grails, but from different perspectives. The first talk was about how to consume ReSTful web services. Scott demonstrated accessing Twitter, Yahoo Search, and a GIS web service from within a Grails application. The code to do this was surprisingly simple.
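To give a flavor of how little code is involved, here is my own sketch (not Scott's exact demo) against the Twitter public timeline as the API existed at the time:

    // Consuming a read-only ReSTful service: one GET, one parse.
    def timeline = new XmlSlurper().parse('http://twitter.com/statuses/public_timeline.xml')
    timeline.status.each { status ->
        println "${status.user.screen_name}: ${status.text}"
    }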

From the other side, Scott demonstrated how to build a Grails application that exposes a ReSTful web service. His focus in this talk was that while the UI is important, don't forget the data! As expected, it is very straightforward to implement a ReSTful interface in Grails. It turns out that there is a bit of boilerplate involved, so Scott recommends moving much of this to template functions.
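The core of the idiom looks something like this (a sketch of Grails 1.x-era conventions with a made-up Book domain class, not Scott's code): map the HTTP verbs onto controller actions in UrlMappings, then render the domain object as XML.

    // grails-app/conf/UrlMappings.groovy -- dispatch on the HTTP verb
    class UrlMappings {
        static mappings = {
            "/rest/book/$id?"(controller: 'book', parseRequest: true) {
                action = [GET: 'show', POST: 'save', PUT: 'update', DELETE: 'delete']
            }
        }
    }

    // grails-app/controllers/BookController.groovy -- just the GET side shown
    import grails.converters.XML

    class BookController {
        def show = {
            render Book.get(params.id) as XML
        }
    }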

I blogged about ReST last year. It is really taking off as the web service technology of choice. Scott compares ReST, which is resource (noun) oriented, to SOAP, which is service (verb) oriented. He presented stats showing that, given a choice, consumers of web services overwhelmingly choose ReSTful interfaces over SOAP-based ones.

Polyglot Persistence - Another Scott Leberknight talk. There is evidently a movement afoot, the NoSQL movement, which advocates the position that while SQL is not Evil, it is often not appropriate. In this talk, Scott presented some potential alternatives to the traditional RDBMS. These include document-based (CouchDB and Amazon SimpleDB), hash table (Project Voldemort), and big table (Google Bigtable and Apache HBase) approaches.

I'm not a persistence guy. Most of my experience is with some simple Hibernate apps and GORM in Grails. It was great to see what else is out there, but my biggest take away from the talk is I'm not smart enough to work at Google. That Big Table stuff is pretty deep!

Git - Matthew McCullough has been on the No Fluff circuit for about a year and is really making his mark. He is a great speaker and really knows his stuff. He is also the speaker who had to recover from a crashed Mac.

Git is a distributed version control system, written by Linus Torvalds of Linux fame, that is quickly becoming the VCS all the cool kids are using. While an argument could be made that some of this is hype, I think there is enough value in the product that the hype is justified. In my case, the value is having the entire repository stored locally on my box, allowing me to work offline and still have access to the entire history. Matthew did a great job demonstrating the Subversion-to-Git gateway. I am looking forward to trying this out.

The syntax is a bit quirky, as might be expected from the inventor of Linux, but it is easy to do the simple things and possible to do some powerful things!

Open Source Debugging Tools - Matthew provided a whirlwind tour of a dozen tools for debugging Java applications and another dozen or so tools for debugging web apps.

In the Java space, he provided summaries of tools for process debugging, stack and heap dump analysis, disassembly and decompilation. One of the tools that really excited me was the Eclipse TPTP comprehension tool. This tool monitors program execution and creates UML sequence diagrams that reflect the actual execution of the program. This tool can even show execution of dynamic methods created in Groovy objects. I asked about tying this tool into a CI server for automatic creation of developer documentation. The product is not there yet, but I can see this happening in the future.

In the web app space, the tools of note (for me) were curl, Firebug, and Firebug Lite. I hope to find some time to learn these better. Matthew made the point that many developers treat System.out.println as the end-all and be-all of debugging. I definitely don't want to be in that camp!

Refactoring - Neal Ford has been a mainstay of the No Fluff tour for as long as I have been attending (4 years now!). He often focuses more on the bigger picture of software development than on specific technologies, and this talk fit that description really well. Some of it was review from reading the seminal refactoring book by Fowler, but there were a couple of good takeaways from the talk.

Prior to refactoring, all you typically have is a pile of undifferentiated code. Following the techniques Neal presented, including the composed method and template method patterns, will result in code that is more expressive and maintainable.
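Composed method is worth a tiny illustration. A toy example of my own, not Neal's: after the refactoring, the top-level method reads like an outline, and every step has a name you can test in isolation.

    // After composed-method refactoring: processOrder is just the outline.
    def processOrder(order) {
        validate order
        def total = totalOf(order)
        persist order, total
    }

    private void validate(order) {
        assert order.items : 'an order needs at least one line item'
    }

    private BigDecimal totalOf(order) {
        order.items.sum { it.price * it.quantity } ?: 0.0
    }

    private void persist(order, total) {
        println "saving order ${order.id} with total \$${total}"
    }

    // Exercise it with a map standing in for a real domain object.
    processOrder(id: 42, items: [[price: 9.99, quantity: 2]])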

One soundbite I took from the talk was "It is OK for the framework to know about your code, but your code should not know about the framework".

Collections - This session was presented by Ted Neward, master of everything low-level on the JVM (and other stuff as well). I have been using java.util.* collections quite a bit lately and have been frustrated by the lack of some seemingly obvious features, such as creating a list from an array. I went to this session to see if I was missing something. To some extent, I was. The Collections and Arrays utility classes have some of the features I was looking for, albeit with warts that still make them less useful than they could be. And some features are still missing (I cannot directly create a List from a List). One takeaway from the session is that I need to go look at Commons Collections.
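For the record, the wart I mean with the array-to-list case: Arrays.asList returns a fixed-size List view backed by the array, so it looks like a List right up until you try to grow it. (Groovy syntax below for brevity; the behavior is pure java.util.)

    // Arrays.asList gives you a List *view* of the array, not a real ArrayList.
    def view = java.util.Arrays.asList('a', 'b', 'c')
    assert view.size() == 3

    try {
        view.add('d')   // structural modification is unsupported on the view
        assert false : 'expected add() to throw'
    } catch (UnsupportedOperationException expected) {
        println 'asList() lists are fixed-size; copy into a new ArrayList to grow'
    }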

Ted joked that every time a Java programmer uses an iterator, a kitten dies. Proper use of collections leads to a more functional programming style. I'm not really a cat person, but I don't advocate cruelty either, so I really want to move in that direction!

FWIW, closures and Groovy are another means to this end, so I'm already on the way.
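For instance, here is what the iterator-free style looks like in Groovy (my example):

    // No explicit Iterator in sight: collect and findAll replace the loop
    // boilerplate (and spare the kittens).
    def squares = [1, 2, 3, 4].collect { it * it }
    assert squares == [1, 4, 9, 16]

    def evens = squares.findAll { it % 2 == 0 }
    assert evens == [4, 16]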

JMS - Another Ted talk (pun intended). I had some very brief exposure to the JMS messaging bus a couple of years ago, but never got into the details. At every No Fluff, I try to get some exposure to a technology or two that I may not necessarily need, but want a passing understanding of. So my takeaway from this session: JMS is a standards-based message passing API that handles both point-to-point and pub/sub messaging models. One of the benefits (provider dependent) can be guaranteed delivery. The API has some quirks, but seems pretty straightforward. There are many JMS providers out there. Ted stressed not coding to a specific provider's jar, but rather compiling against the standard JMS API jar and deploying with the provider's jar, to ensure you are adhering to the standard. The biggest disappointment was finding out that JMS messages from one vendor are typically not compatible with other vendors' implementations.
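For a feel of the API shape, here is a sketch of a provider-neutral point-to-point send (Groovy for brevity; the JNDI names are assumptions, and a real app would get them from its provider's setup):

    // Compile against the standard javax.jms API only; the actual provider is
    // bound through JNDI at deploy time.
    import javax.jms.ConnectionFactory
    import javax.jms.Session
    import javax.naming.InitialContext

    def ctx = new InitialContext()
    ConnectionFactory factory = ctx.lookup('jms/ConnectionFactory')   // assumed JNDI names
    javax.jms.Queue queue = ctx.lookup('jms/ExampleQueue')

    def connection = factory.createConnection()
    try {
        def session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE)
        def producer = session.createProducer(queue)
        producer.send(session.createTextMessage('hello, JMS'))
    } finally {
        connection.close()
    }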

So that's a wrap for this year. My head is full and there aren't enough hours in a day, but even with these constraints, I always feel like I have improved as a developer just by attending the show. And after I spend some time playing with some code... look out!

I'll close with a teaser. Jay is going to have some really cool news for release in the next couple of weeks. I'll tweet/blog when he does. I don't want to steal his fire, so until then, I will leave you in suspense!

Sunday, March 08, 2009

Brain Quenched, Actually Soggy to the Point of Leaking.

In my last blog, I suggested that if you wanted to learn Groovy or Grails, Scott and Andrew at ThirstyHead were the people to talk to. Well, last week I put my money where my mouth was and attended the inaugural class from ThirstyHead, "Advanced Grails", taught by Scott in Lone Tree, CO.

With the exception of a hard drive crash on my part that resulted in switching laptops mid-class, everything went fantastically. I really learned a lot in three short days.

I have extensive experience working at many levels of software development, from writing assembly and embedded C on 8-bit processors, to developing hard real-time systems using a commercial RTOS (in C & C++), to multi-threaded Java development. One co-worker recently called me the Swiss Army knife of programmers. I really liked that description. But the complement of tools in this knife is not complete by a long shot, and I am always looking to add new ones. Web development is one item lacking from my repertoire. Enter Grails.

Grails is a web framework targeted at the JVM, and most first reactions tend to be "Hmmm, yet another web framework". The distinction Grails brings is a framework that embraces convention over configuration, dynamic programming, built-in support for testing, and tools that fit well in an agile shop, all in a framework that is inherently Java friendly. While there is no arguing that Grails and its underlying language, Groovy, borrowed a lot of inspiration from the Ruby on Rails framework, it is relevant in its own right due to its strong integration story with Java. For a beginning look at Grails, check out Scott's Mastering Grails series on IBM DeveloperWorks. There are also some good books out there, Grails in Action and The Definitive Guide to Grails. After you are comfortable with the tutorials, have the basics under your belt, and want to move to the next level, Scott's class should be a prime consideration.

The goal of the class was to do a deep dive into the inner workings of Grails. In Scott’s words, by the end of the class, he wanted us to not only know how to use Grails, but how to make it sing… to a tune of our own selection. I think he got the job done.

We started with a deep dive into skinning a Grails app. Step one was looking at some existing CSS templates available under Creative Commons licenses. After choosing one, we learned how to skin our app with it, modifying the stock GSPs (Groovy Server Pages, similar to Java Server Pages). In true agile fashion (Scott is an expert on agile development as well), we added one feature at a time, first adding the correct tags and labels to the GSP and then learning how to break it into reusable templates. The View aspect of the MVC pattern is one of the areas I am weak in, so this section was extremely helpful. I will be attending the Ajax spike offered by ThirstyHead in April to further expand my knowledge in this area.

The next section covered Authentication and Authorization in a Grails application. Scott demonstrated, in a remarkably small amount of code, how to roll your own basic A&A module using Grails interceptors. He followed this up by introducing two popular security plugins, Authentication and JSecurity. Authentication looks fairly straightforward and provides some bells and whistles that would require more effort to develop from scratch. JSecurity appeared to be the nuclear option for security, powerful, but complicated.
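For flavor, here is my reconstruction of the roll-your-own idiom (not Scott's code): a beforeInterceptor runs ahead of every action in the controller, and returning false cancels the request.

    // grails-app/controllers/AdminController.groovy
    class AdminController {
        // run checkLogin before every action except the login page itself
        def beforeInterceptor = [action: this.&checkLogin, except: 'login']

        private checkLogin() {
            if (!session.user) {
                redirect(action: 'login')
                return false        // false cancels the intercepted action
            }
            return true
        }

        def login = {
            // render the login form here
        }

        def index = {
            render "Welcome back, ${session.user}"
        }
    }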

Scott spent considerable time covering Grails' support for ReSTful web services, including time spent looking at other ReSTful sites. For the special case of sites that only provide read access to their data, Scott has coined the term GETful (since the only HTTP verb supported is GET) to describe the interface. ReST is becoming extremely popular due to the ease with which it can be implemented without compromising application functionality. Grails makes a ReSTful app very easy to implement.

We then covered several miscellaneous topics, including codec development (which allows data conversion services to translate data to and from strings), operator overloading (supplied via Groovy), file upload, supplying syndication services from a site using Atom, indexing using Lucene and Compass, and integrating e-mail support and JMX monitoring into your application.
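Codecs deserve a quick illustration because the convention is so compact: drop a class whose name ends in Codec into grails-app/utils, and Grails dynamically adds the matching encodeAs.../decode... methods to every object. A toy codec of my own:

    // grails-app/utils/ShoutCodec.groovy
    class ShoutCodec {
        static encode = { obj ->
            obj.toString().toUpperCase() + '!'
        }
        static decode = { obj ->
            obj.toString().toLowerCase() - '!'
        }
    }

    // Once Grails wires it up, every object gains the matching methods:
    //   assert 'hello'.encodeAsShout() == 'HELLO!'
    //   assert 'HELLO!'.decodeShout() == 'hello'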

We finished up the last day by digging into the details of the Grails build system, which is based on Gant, the Groovy DSL that wraps Ant. In this section, we learned how to extend the build system to perform custom actions during a build (like packaging special files into the jar or war file).
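The hook mechanism itself is tiny: Grails sources scripts/_Events.groovy and calls any closure named event<Name> at the matching point in the Gant build. A sketch (the event name comes from the Grails docs; the file being copied is made up):

    // scripts/_Events.groovy -- hook the war packaging step
    eventCreateWarStart = { warName, stagingDir ->
        // stage an extra properties file into the war before it is zipped up
        ant.copy(file: 'local/deploy.properties',
                 todir: "${stagingDir}/WEB-INF/classes")
    }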

This was a LOT of info to fit into three days. Surprisingly, at no time did I feel that Scott had to rush things and we even had time to take a couple of detours based on questions from the class. The most memorable was when we figured out how the Twitter ReST interface worked. There was ample time to work on the lab exercises, which were written to build upon and reinforce the lecture material. Scott was always available when things went horribly wrong. The labs really brought everything home.

Another cool takeaway from the class was that Scott bundled up all of the code he wrote during his lectures. Most of this code is commented to help remind us what the key points were. This code is going to be a great resource as I start to use the material I learned.

Given what I already knew about Scott (his knowledge of the subject matter, excellent speaking skills, and cool sense of humor), I can't really say I'm surprised about how well the class went. I'm looking forward to attending the tech spike in April.

If you are in need of a quick jumpstart in any of the topics they offer, try ThirstyHead. You won’t regret it.

Wednesday, January 14, 2009

Quench Your Head's Thirst!

I am pleased to report that a good friend and really smart guy, Scott Davis, has formed a new company, ThirstyHead, with another great industry icon, Andrew Glover.

The company will initially provide top-notch training focused primarily on Groovy and Grails, but if I know anything about these guys, we can expect to see other great offerings as well. Besides their excellent knowledge of all things Groovy and Grails, Scott is the GIS guy in Java circles (he wrote the book!) and Andrew has done some amazing work in Behavior-Driven Development (he wrote the code!).

If you have not had the chance to see Scott talk, you are truly missing a great learning opportunity and a fun time. Scott's speaking style is very informal and engaging. He is very passionate about the subjects he talks about and draws extensively from his own experience.

I have only had the opportunity to attend one talk by Andrew, but have listened to many of his podcasts on JavaWorld and read his entertaining Disco Blog. Based upon what I have heard, I think the Davis/Glover partnership will go far.

If you want to quickly bootstrap your understanding of cutting edge technologies that are bringing rapid, dynamic development tools to the Java platform, ThirstyHead is the place to go! I owe a lot professionally to Scott and wish him and Andrew much success in this endeavor!

Sunday, November 23, 2008

The geeks were back in town...

Jay Zimmerman and crew descended upon southern Denver for the Fall 2008 No Fluff Just Stuff symposium. And I had the opportunity to attend.

This is my fourth No Fluff and, as usual, it was great. Eleven 90-minute sessions, a great keynote, and an entertaining and informative panel discussion, coupled with some great networking, filled the weekend. My brains were leaking out of my ears by conference end Sunday evening.

Here are the sessions I attended (as always, the biggest complaint I have is that there were some hard choices to make when choosing sessions):

Beginning Drools - Rule Engines in Java, Brian Sam-Bodden
What's Going On?: Complex Event Processing w/ Esper, Brian Sletten
Rapid Web Development with Grails and Ajax, Scott Davis
Boosting Programmer productivity with Mylyn, Brian Sam-Bodden
Design Patterns in Dynamic Languages, Neal Ford
Java.next #1: Common Ground, Stu Halloway
Architecture and Scaling, Ken Sipe
Git control of your source, Stu Halloway
Agile Test Driven Development With Groovy, Jeff Brown
Real World Hibernate Tips, Scott Leberknight
Google Your Domain Objects With Hibernate Search, Scott Leberknight
Keynote - Soft Skills and Organizational Dynamics, Ken Sipe

Here are some rambling takeaways from the sessions. The goal is to pique curiosity, not repeat the presos. Feel free to drop me some e-mail or leave a comment if you would like to discuss further.

The first two sessions I chose based upon professional curiosity, since, in a past life, I worked on a commercial Complex Event Processor. Rules engines are similar in nature, so I wanted to compare the two. It turns out the biggest difference is that in a CEP, time is a de facto part of the system (you handle data in time order the vast majority of the time), while by default the Drools rules engine processes rules in the order they were entered into the system. Another difference is that a rules-based system is typically used when you already understand the business logic you wish to apply to the data, whereas a CEP system might be used to discover what those rules are. Keep in mind that these differences are a bit fuzzy; I believe you could use either system to solve most problems without any serious complications. It was interesting to see that the user interface of Esper was not radically different from the commercial product I worked on (the underlying implementation could be another matter).

Most of my technically oriented friends know I am a Groovy fanboy. For that reason, I sought to expand my knowledge of this dynamic language and attended a couple of sessions dealing with it. Unfortunately, both talks were a bit basic for my needs. The Grails talk was interesting, but most of the time was spent covering Grails itself, so it was review. Because of time, the Ajax integration was a bit rushed and not given the detail I had hoped for. The good news is that Scott runs the local JUG, so there is hope he will present the Ajax integration there in the near future (hint, hint). The Groovy testing talk suffered from a similar problem: too much time covering the basics and not enough time discussing the interesting problems you can solve using GroovyTestCase. In particular, I had hoped to see more about how to test Java with Groovy. This gets a lot of lip service, but examples are less common. (Aside: I have done a little with this; see Exploring Groovy's Optional Typing Features.) Instead, two very basic examples of using Groovy to test Groovy were presented. For folks new to Groovy, this was probably helpful. Another case of a talk that needs to be broken into basic and advanced sessions.

Neal's talk on Design Patterns in Dynamic Languages, while not specific to Groovy, provided my Groovy fix for the weekend. The takeaway from this talk is that design patterns really have two purposes. As a mechanism for communication with other developers, they provide a set of common semantics. Neal feels this is a "Good Thing"(tm) and doesn't think this should change. The second purpose, on the other hand, is to provide band-aids for languages that lack features that can directly solve the problem a pattern is targeted at. Neal argues that in the dynamic language arena, this reason is less relevant. Referring to the Gang of Four book, Neal related anecdotally that one of the authors once stated that the subtitle of the book could have been "Making C++ suck less." To back this up, Neal went through several patterns, starting with the Command pattern, that require a lot of scaffolding in traditional languages but become so trivial in a dynamic language (examples used Groovy and Ruby) that it is almost not obvious you are using the pattern at all. He also introduced a new pattern, common in dynamic language solutions, called the Type Transmogrifier pattern, whose goal is transforming types as needed as part of a fluent interface call, such as a DSL that allows an end user to request services in convenient units (grams versus ounces) and lets the code handle the conversion.
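To make the Command example concrete (my own toy version, not Neal's slide): in a dynamic language the closure is the command, so the interface-plus-anonymous-class scaffolding simply evaporates.

    // The whole Command pattern, dynamic-language style: closures are
    // first-class, so no Command interface or concrete command classes needed.
    def commands = [
        { println 'backing up the database' },
        { println 'rotating the logs' }
    ]
    commands.each { it() }   // the "invoker" just calls each command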

The Mylyn talk was another session taken principally to scratch the itch of curiosity. You get Mylyn in a standard Eclipse download, but what the heck does it do? From Brian's talk, it appears one excellent use is to provide an Eclipse-based interface into defect tracking tools like Trac and Jira. Talking to another attendee after the talk, it sounds like Mylyn can also be used for task tracking without the need for an external tool. I find integration with external tools to be the most compelling argument for me, but even that seems a bit weak unless there is organizational use of the tool. Still, a good talk, and now I know...

Source code control is important. It is one of those things that every good developer I have worked with feels strongly about. Having worked in systems like ClearCase for many years, I found CVS (and later Subversion, which is CVS++) a breath of fresh air when I started using it. But even these tools are getting a bit long in the tooth. Git is an SCM tool that is garnering a lot of buzz. Stu's talk was a great intro to the tool, which represents a bit of a paradigm shift from the traditional tools. At its heart, it is really a distributed file system with some SCM tools layered on top. One bit of trivia I had not heard before: Git was written by Linus Torvalds. The selling point for Git is how easily it supports distributed development, where you do not have access to a central repository all the time, and the ability to do lightweight branching, again without having to phone home first. This is a cool tool and one I intend to start using at home right away.

The Scaling talk was good, if a bit web-centric in nature. My biggest takeaway from the talk is that scaling is not necessarily equivalent to performance. A system that scales may actually degrade in performance as load increases, but it does so in a linear and measurable fashion. Ken stressed the need to do load testing as a means to validate that your system behaves in this fashion. Another key takeaway that seems obvious once you hear it is that any agreed-upon SLA must include load along with response time; otherwise, you are almost guaranteed to violate the SLA at some point. Also discussed was the fact that systems can scale horizontally (add more boxes) or vertically (add more memory, increase processor speed). Given that increasing processor speed these days really means adding more cores rather than increasing the clock speed, I think the distinction is a bit artificial. Given this fact, Ken stressed the need to understand concurrency.

Beyond the technical details presented at NFJS, one of the really cool aspects of the conference is the way the presenters share their vision of where the industry is going. Two very common memes this year were concurrency and the JVM as a platform.

As mentioned earlier, improvements in processor speed are leveling off, and multi-core processors are now the de facto mechanism used to increase the performance of a system. But this presents new issues for software. The only way a software product can benefit from the performance increase provided by a multi-core system is if it is designed to operate in a concurrent fashion. And worse, if software is not written to operate in a concurrent environment (that is, if it is not thread safe), not only will it not see a performance improvement, but it may fail when run in these environments. In past conferences, there have been some great presentations regarding this. This year, rather than a focus on thread-safe Java, presentations were made regarding functional languages that run on the JVM and are inherently thread safe. Scala and Clojure seemed to be the popular languages among the presenters this year. Venkat has a book in the works on Scala and Stu has a book (out in beta) on Clojure. I just picked up the Clojure book and will hopefully blog on it soon (the folks reading this who know me know what soon means). Neal has coined the term "Polyglot Programming" and advocates using the correct language for the job, e.g., Groovy/Grails for the front end and Scala for the back end to provide thread safety. When folks complain about the complexity this adds, he points out that we already do this, since most projects have custom XML property files, Spring config files, Ant (or Maven) build files, Java source code, etc. In other words, developers are already used to it.

The other meme, the JVM as a platform, has been building steam with the presenters for the last three years and really seems to have arrived this year. Stu has a series of presentations he is calling the Java.next series. I attended the Common Ground session, in which he presented four up-and-coming languages that run on the JVM: Groovy, JRuby, Clojure, and Scala. He spent most of his time discussing the common features of these languages and why any of them represents a better language solution than Java. Rather than relist those features, I'll just point you to Stu's blog on the topic, here. As an aside, Stu was a huge Ruby advocate last year. This year, he seems to have decided that Groovy is a reasonable language as well, but is really into Clojure right now. He made a comment about being attracted to bright and shiny new objects. I can totally understand that.

A blog post cannot even begin to impart all of the knowledge that is shared at a No Fluff. And beyond the sessions, there are great side conversations, networking, and food (really good food!). One of my favorite meta-activities is to collect entertaining sound bites. In conclusion, here are a few gems I picked up this time...

Never burn a bridge (unless the orcs are behind you)

A compiler tells you your code is well formed, a unit test tells you your code is well behaved

Developers are responsible for keeping their knowledge portfolio up-to-date

Increase your digital footprint, blog

A stack trace means nothing to a human (I laughed my rear off on this one)

Eat like a bird, poop like an elephant (i.e., consume information in large quantities, share information in large quantities)

Agile development means sustainable progress, not necessarily fast progress

Be resourceful, not a resource

Greenspun's 10th Rule applies to Java

Evangelize, be passionate!

Sunday, July 20, 2008

Exploring Groovy's Optional Typing Features

So I was talking to a friend over a beer after the last BJUG. That night it was the Venkat Show. If you have never had the chance to listen to Venkat talk, all I can say is try harder; the guy is amazing. He opened the JUG with a discussion of interesting features of Java that might have escaped a developer's notice (or were forgotten along the way). His second talk was on testing with Groovy. As you will soon see, the testing topic turned out to be helpful with this blog.

Note: code for this blog can be downloaded from here.

After the JUG, we started discussing Groovy and dynamic programming in general. As folks who know me well can attest, I am a card-carrying, drank-the-dynamic-Kool-Aid Groovy fanboy. So when my friend told me of something he learned at a recent NFJS seminar (he probably brought it up just to rub in the fact that he was there and I wasn't, but I digress), I had to re-think my choice of next-gen language. Fear not, the story ends well...

Here is the basic challenge he threw down: "Write a Groovy method that takes a statically typed argument of List. Now call this method from another method, supplying an argument of type Stack. It will work just fine, no type mismatch errors."

I didn't believe this would work. I know Groovy is a dynamic language, but due to its close affinity with Java, I had believed that although typing is optional, when a type was provided, it would be honored. So, rolling up my sleeves, I wrote some code (see README and code in the test1 directory).
class GroovyAdder {
    def doit(List list) {
        println "In list version of doit"

        def sum = 0
        list.each { entry ->
            sum += entry
        }
        return sum
    }
}

and then a Groovy test for this class (I told you Venkat's talk would be relevant):
class TestGroovyAdder extends GroovyTestCase {
    def adder

    void setUp() {
        adder = new GroovyAdder()
    }

    void tearDown() {
        adder = null
    }

    void testList() {
        List list = [0, 1, 2]
        assertEquals 3, adder.doit(list)
    }

    void testStack() {
        Stack stack = new Stack()
        stack.push(3)
        stack.push(4)
        stack.push(5)
        assertEquals 12, adder.doit(stack)
    }
}

Running this test, I find that the original statement was true: calling the method with a Stack argument works.
.In list version of doit
.In list version of doit

Time: 0.249

OK (2 tests)

I was so confused! This brings us back tangentially to Venkat's first talk. While the problem is not something he addressed, it does fall into the category of "things forgotten". It took me a while, but eventually I came around to the realization that Stack implements the List interface. Ah, a Stack IS A List, so all is well. To verify this, I added the following test (see README and code in the test2 directory):
void testQueue() {
    Queue queue = new PriorityQueue()
    queue.add(6)
    queue.add(7)
    queue.add(8)
    shouldFail(MissingMethodException) { adder.doit(queue) }
}

A Queue does not implement List, so if the typing information was honored, this call should fail, and it did! A MissingMethodException was thrown. Problem solved; I sent the solution to my friend. He agreed that this solved the initial problem, but added that it seemed a royal pain that I had to run the program to see the problem, when it could be (and is, in Java) detected during compilation. He's right if we are talking about a statically typed language, but Groovy is really a dynamic language at its core. The amount of compile-time checking it can do while still adhering to the dynamic mantra is limited. Let's explore this a bit (see README and code in the test3 directory).

First, let's just add a second, untyped method (with its own println) to the GroovyAdder class:
class GroovyAdder {
    def doit(List list) {
        println "In list version of doit"

        def sum = 0
        list.each { entry ->
            sum += entry
        }
        return sum
    }

    def doit(def list) {
        println "In def version of doit"

        def sum = 0
        list.each { entry ->
            sum += entry
        }
        return sum
    }
}

We also need to change the test that uses Queue (hint, the call will succeed this time):
void testQueue() {
    Queue queue = new PriorityQueue()
    queue.add(6)
    queue.add(7)
    queue.add(8)
    assertEquals 21, adder.doit(queue)
}

Running this test, here is the output we get:
.In list version of doit
.In list version of doit
.In def version of doit

Time: 0.226

OK (3 tests)

This is way cool. Groovy was smart enough to choose the typed version of the overloaded method when it could and fall back to the dynamic version when nothing more specific matched.

But, of course, this still could be decided at compile time. Let's try one more thing (see README and code in the test4 directory). First, change the untyped method we just added as follows (it is now a typed method as well):
def doit(Queue list) {
    println "In queue version of doit"

    def sum = 0
    list.each { entry ->
        sum += entry
    }
    return sum
}

So now we have a Groovy class where the author provided for the use of Lists (which gives us Stack for free) and Queues. Of course, now I want to use a Set. I'm never happy. So let's add the following test and associated closure:
void testSet() {
    def expando = new ExpandoMetaClass(GroovyAdder)
    expando.doit = dynamicDoit
    adder.metaClass = expando

    Set set = new HashSet()
    set.add(9)
    set.add(10)
    set.add(11)
    assertEquals 30, adder.doit(set)
}

Closure dynamicDoit = { Set set ->
    println "In set version of doit"

    def sum = 0
    set.each { entry ->
        sum += entry
    }
    return sum
}

We have now crossed the boundary and are firmly entrenched in dynamic land. The compiler can't save us here. Prior to testing with the set, the test code alters the metaclass of the object we intend to test, adding a dynamic method that knows how to handle sets (it could just as easily have handled Iterable or even def and been more general, but some folks never learn). Here is the output from running the test now:
.In list version of doit
.In list version of doit
.In queue version of doit
.In set version of doit

Time: 0.23

OK (4 tests)

As stated earlier, the compiler can't help when the code can dynamically change after compilation, and further, if the compiler tried, it would hinder the dynamic features of the language.

So there you go! But wait, there is more. As an added bonus, let's look at what happens when we are using Java (see README and code in the test5 directory).

First, we create a simple Adder POJO:
import java.util.List;

public class Adder {
    public int doit(List list) {
        int sum = 0;

        for (Object entry : list) {
            sum += (Integer) entry;
        }

        return sum;
    }
}

See the Java test code on the download site for an example of how you get a compile-time error if you attempt to pass a Queue into the doit method of this class. You can't get there from here, at least not in Java. So let's see what we can do when using this class from Groovy. Here is the Groovy test code:
class TestJavaAdder extends GroovyTestCase {
    def lister = new Adder()

    void testList() {
        List list = [0, 1, 2]
        assertEquals 3, lister.doit(list)
    }

    void testStack() {
        Stack stack = new Stack()
        stack.push(3)
        stack.push(4)
        stack.push(5)
        assertEquals 12, lister.doit(stack)
    }

    void testQueue() {
        Queue list3 = new PriorityQueue()
        list3.add(6)
        list3.add(7)
        list3.add(8)
        shouldFail(MissingMethodException) { lister.doit(list3) }
    }

    void testQueueWithMop() {
        Adder.metaClass.invokeMethod = { String name, args ->
            println "In Groovy version of Adder.doit"

            if ("doit" == name) {
                def sum = 0
                args[0].each { entry ->
                    sum += entry
                }
                return sum
            }
        }

        Queue list3 = new PriorityQueue()
        list3.add(6)
        list3.add(7)
        list3.add(8)
        assertEquals 21, lister.doit(list3)
    }
}

Running these tests gives:
.In Java version of Adder.doit
.In Java version of Adder.doit
..In Groovy version of Adder.doit

Time: 0.227

OK (4 tests)

The results from testList and testStack are not surprising, as they would work the same way in Java. But notice that in testQueue we were able to write a call to the statically typed Java doit method, which requires a List, passing it a Queue, and we did not receive a compile error! We only found the error at runtime.

The reason, again, is that if Groovy strictly enforced type expectations at compile time, there would be no way to use the dynamic trick demonstrated in testQueueWithMop (MOP = Meta Object Protocol), in which we use another Groovy mechanism to dynamically extend the features of a class, this time a Java class, without needing its source code.

The purpose of this blog was primarily to answer the question of how Groovy's optional type mechanism works and provide my understanding of why things work the way they do. In doing so, I made use of some of Groovy's MOP features. To learn more about these, I refer you to the sources I used. Scott Davis' book, Groovy Recipes, has a great intro to Metaprogramming, along with easy to use code snippets that illustrate the topic. Venkat also has a book, Programming Groovy, that delves even deeper into the topic. I highly recommend both books.

I also did not go into the debate of dynamic versus static typing. Folks a lot smarter than I am are discussing this issue; I am principally a spectator to the debate. I will say that for DSLs (which I am very interested in), testing, and script use, the dynamism of Groovy rocks! And I am tending to lean that direction for other code as well, as long as the code is backed by good unit testing.

Hmmm, guess I picked a side. Well, no one has ever accused me of not having an opinion.