Posted by Gigi Sayfan
on September 30, 2015
Open source development has never been more prevalent and successful than it is today. The biggest players regularly publish the latest and greatest technology with permissive licenses. Some companies even do their whole development in public — inviting anyone to fork, download and send pull requests.
Google has always been a strong advocate of open source and even funded a lot of non-Google open source projects through their Summer of Code program. Facebook publishes not just software, but even the specs to their servers.
Microsoft recently joined the party, open-sourcing key parts of its technology stack and even developing core pieces openly on GitHub.
The big question is whether or not the agile development style mixes well with open source. On the face of it, they are polar opposites. Agile evolved from tightly knit co-located teams. Open source is all about strangers who have never met collaborating across the globe.
All the same, there are many similarities and shared principles between the two methodologies. Both paradigms put a great deal of emphasis on testing and automation. Both favor small, quick iterations. Obviously, co-located agile teams can publish the results of their work as open source. The more interesting case is an agile team that is not co-located successfully taking advantage of open source methodologies such as rigorous source control policies, pull requests and continuous integration tools.
Posted by Gigi Sayfan
on September 4, 2015
Pair programming is one of the most controversial agile practices and is also the least commonly used in the field, as far as I can tell. I think there are very good reasons this is the case, but perhaps not the reasons everybody thinks about.
Pair programming consists of two programmers sitting side-by-side, working on a given task. One codes while the other observes, suggesting improvements, noticing mistakes and assisting in any other way, such as looking up documentation.
The benefits are well known. For more information, you can download a PDF of The Costs and Benefits of Pair Programming.
But why didn't it take off like so many other agile practices that have become mainstream staples? The reason that is often mentioned is that managers don't like seeing two expensive engineers sitting together all day and working on the same code. That may be true for some companies. But often it's the developers themselves who dislike pair programming.
There are many reasons that some developers dislike pair programming. Many developers are simply loners who prefer to focus on the task at hand, and their flow is disrupted by the constant interaction. Many developers like to work unconventional hours or from home or a coffee shop, which makes it difficult to pair them up. The original extreme programming called for a 40-hour work week in which everybody arrived and departed at the same time, but in today's flexible work environment this is not always the case.
I, personally, have never seen full-fledged pair programming practiced and it was never even on the table as a viable alternative. My experience is based on many years of working for various startups that used many other agile practices. I tried to institute pair programming myself in a few companies, but it never caught on.
So, is pair programming a niche practice that can only be used by agile zealots that follow the letter of the law? Not necessarily. There are several situations where pair programming is priceless.
The most common one is debugging. I've used pair debugging countless times. Whenever I get stuck and can't make sense of what's going on I'll invite a fellow developer and together we are usually able to figure out the issue relatively quickly. The act of explaining what's going on (often referred to as "rubber ducking") is sometimes all it takes.
Another typical pair programming scenario is when someone is showing the ropes to a new member of the team. This is a quick way to take the newcomer through each and every step involved in completing a set task and showcasing all the frameworks, tools and shortcuts that can be used.
What are your thoughts on pair programming?
Posted by Gigi Sayfan
on August 26, 2015
Decision-making is at the heart of any organized activity and, as such, there are significant associated risks and costs. The higher up you are on the totem pole, the more risk, cost and impact are associated with every decision you make. For example, if you are the CTO and decide to switch from private hosting to the cloud, that has enormous ramifications. Obviously, such a switch is not a simple process. It will entail a thorough evaluation, prototype and gradual migration. This is often the reason that many large organizations seem to move at such a glacial pace. But, there are many decisions that can be made and acted upon quickly and yet often take a very long time.
This is often tied to the reporting and approval structure in the organization. The level of delegation and the freedom of underlings to make decisions on their own without approval is often the key factor.
There are many good reasons for managers to require approval: maintain control, ensure that good decisions are being made, stay up to date and informed on higher-level decisions. The flip side is that the more a manager is involved in the decision-making process, the less time he or she has to interact and coordinate with other managers and superiors, study the competition, think of new directions and attend to many other management activities. This is all understood, and every manager eventually finds the right balance.
What many managers miss is the impact on their subordinates. Very often, a delayed decision costs much more than a quick bad decision would. Let's start from the ideal situation — your employees always make the right decision. In this case, any delay due to the need to ask for approval is a net loss. The more control a manager maintains and the more direct personnel he or she manages, the more loss will be accrued.
But what about the bad decisions that such processes prevent? This is obviously a win in terms of one less bad decision, but the downside is that in the long-run your subordinates will not feel accountable. They'll expect you to be the ultimate filter.
If you're aware of this then the path forward is pretty clear — delegate as much as you feel comfortable (or even more). Let your underlings make mistakes and help them improve over time. Benefit from streamlined productivity and focus on the really critical decisions.
Another important aspect is that not all bad decisions or mistakes are equal. Some mistakes are easily fixed. Decisions that may result in easily reversible mistakes are classic candidates for delegation. If the cost of a bad decision is low, just stay out of the loop.
Posted by Gigi Sayfan
on August 17, 2015
The Agile style of development is considered best for many organizations and projects. There is a lot of criticism too, but overall it has gained significant mind share and developers everywhere use it. The big question is, "What is Agile development?" You could look at the Agile Manifesto. This is a great collection of values and principles. However, the Agile manifesto is not a recipe. If you want to create an Agile process out of it, you have a lot to figure out. Let's take a look at some of the common, or at least well known, Agile processes out there: Extreme Programming, Scrum, Kanban, Lean Software Development and Crystal Clear. They are all quite similar, actually, with different emphasis here and there, but overall there is a sense of a common core.
The other important thing about Agile development in general is that it prefers lightweight methods over precisely prescribed processes (one of the core values). As such, Agile methods are in a difficult place, because they can either stay on the fuzzy/fluffy edge of the spectrum or succumb to the grind and actually recommend specific practices and processes.
I've been practicing Agile development for more than 15 years in the trenches and it's never done by the book. In the field, Agile is often some adaptation of Agile methods, with or without the actual terminology.
Some concepts and practices made it to the mainstream big time, including automated testing, continuous integration, short iterations, refactoring – others, not so much, or not uniformly.
The most important practice, in my opinion, is short iterations. Why is that? I discovered that if you practice short iterations almost everything else falls into place: You deliver frequently, which means you need to focus on what's most important, which means you need to define done criteria, which means you need automated builds and testing and continuous integration, which means you can refactor with confidence, etc., etc.
A short iteration is unambiguous. If you commit that you will deliver something every two weeks, you're golden. It's easy to keep track of what's going on (only a two-week horizon to look at) and it is easy to measure progress and team velocity (you have milestones every two weeks). Of course it's easy to respond to change, because every two weeks you start from scratch. By definition, you can't be more than two weeks into anything.
Posted by Jason Bloomberg
on November 25, 2014
Perhaps you have noticed that my Twitter handle is @theebizwizard. You may have even noticed that my Skype handle is as well. It occurred to me that I’ve never told the story about that handle. I’d say it’s about time.
Ebusiness, as we gray-haired geezers are happy to relate, is a hype term from the dot.com crazy period of the late 1990s. The e prefix stands for electronic, as in e-mail. Well, if we can electronic-ify our mail into email, and we can electronic-ify our commerce into ecommerce, let’s take the next step and do the same thing with our entire business!
The core idea made perfect sense. Even as long ago as the 20th century, many businesses became increasingly dependent on their IT, and in some cases, we could even say that their business was IT. Take banking, for example. Money was no longer cash in a drawer, it was bits on a wire. Everything a bank does touches its technology.
But then two things happened to this essentially good idea. First, the pundits and vendors took the term and hype-ified it, essentially turning ebusiness into dot.com insanity, the enterprise version. Ebusiness rode the wave up to the top and predictably crashed along with the rest of the dot.com nonsense.
The second thing that happened to ebusiness, however, is more important. We finally realized that even ebusinesses aren’t made up entirely of technology. In reality, businesses are made up of people, just as they have been since the dawn of commerce back in the stone ages. Technology only gives us tools – increasingly powerful tools, but tools nevertheless.
What about Wizard?
There’s more to the theebizwizard story, however – the word wizard. This story begins in 1996, when my first wife and I were thinking about buying a mansion in Pittsburgh and turning it into a bed and breakfast. We never ended up buying the mansion, but I did buy the domain name rhodes.com, as a fellow named Rhodes originally owned the mansion.
It may be hard to believe now, but in those days it was bad form to own domain names you weren’t using, so I turned rhodes.com into my personal web site. For my email address I concocted firstname.lastname@example.org – to indicate I could have selected any email address, not to indicate any kind of magical ability on my part.
But to my surprise, in 2006 someone agreed to my price – $50,000. For a domain name! It seemed the dot.com craziness hadn’t entirely gone away after all. (The buyer put up a tourist site promoting Rhodes, Greece, until they went out of business. To this day www.rhodes.com is still on the market.)
So, to make a long story, well, even longer, when it came time to create a Twitter handle, I put together ebusiness and wizard – not to indicate I was a wizard at ebusiness (although thank you for thinking that!) but rather as a reminder of the two sides of hype. Yes, ebusiness as a term came and went, but the fact I was able to sell a domain name for more than I made in two years as a high school teacher shows the true power of hype if you know how to use it (or if you just get lucky).
Now It’s Digital
In the 2000s, I successfully rode the SOA hype term up and then down. Today I’m doing the same thing with digital transformation. I’m the first to admit this new term – especially the word digital – has its flaws, but it represents a set of very powerful and important ideas nevertheless.
In fact, we’ve come full circle with this story, as digital business more or less means the same thing as ebusiness. Admittedly, today digital technologies include mobile and social media, where ebusiness focused primarily on the web, so the technology story today is noisier and more confusing. But the fundamental principles still remain: in particular, the fact that digital is about people.
Today, the fundamental driving force behind digital transformation is the shifting preferences and behaviors of users – either consumers or employees, who after all, are also consumers. Consumers demand multiple technology touchpoints, and it is for that reason (and only that reason) that digital is about technology at all.
Nevertheless, people are missing this fundamental point, just as they did with ebusiness back in the day. It seems that every day I spot yet another digital consultant or digital analyst hanging out their shingle, promising to help companies with their digital strategies. And what do those strategies consist of? Mobile strategies. Twitter strategies. Even web strategies. The list goes on and on. In other words, technology strategies.
What about customer strategies? Not sexy enough. Businesses have needed customers since the dawn of time, after all. What’s new or exciting about that? Today, people are confused about mobile and social media and don’t even get me started about the Internet of Things. Those are the hot topics! And hot topics are where the money is!
Well, yes and no. Yes, there’s money in hype – that’s the lesson email@example.com taught me, after all. But never forget that people are – and will always be – at the center of business. Even ebusiness, or now, digital business.
Avoid Shiny Things
The reason it’s so easy to miss this fundamental principle is due to what I like to call the shiny things problem. People like shiny things, after all. Why? Because they’re shiny. The shinier the better. And if something is really shiny, you’ll forget all about what problem you’re trying to solve.
Techies frequently fall for shiny things. It seems that every new programming language or open source product is the next shiny thing. Ooh, Haskell! We gotta program in Haskell! Docker! We gotta use Docker! Etc. Etc. Believe it or not, SOA was a shiny thing in its day. I spent half a day in my SOA class trying to convince a roomful of architects that if something ain’t broke, don’t fix it. Don’t do SOA because it’s shiny!
Now digital transformation is the next shiny thing. People want it because, well, because it’s shiny! My advice: don’t fall for the shininess trap. Sometimes you do really need shiny things, and then by all means, go for it. But start with the problem you’re trying to solve. In the case of digital transformation, that starting point always centers on the customer, not the technology.
Posted by Jason Bloomberg
on November 19, 2014
I’ve started using the phrase digital professional, in particular in my recent article dinging Amazon’s cloud division Amazon Web Services (AWS) for not having a clear digital strategy. However, I haven’t been very clear on what I mean, so it’s time to put a finer point on this terminology.
Others were designated as creatives, including graphic designers and the like. A third subset of the web professional community was the marketing people: hammering out web strategies, focusing on key performance indicators like conversions and churn, and figuring out how to communicate to users using this newfangled World Wide Web of ours.
And finally, there were the information architects, designing page flows and form interactions, making sure site maps made sense and navigation worked as expected.
On Beyond the Web
Cut to 2014, and the digital professional sports all these roles and more. Today digital is much more than the web, as it also comprises mobile, social media, and a burgeoning class of newer technology touchpoints that are currently undergoing a phase of rapid, disruptive innovation – everything from iBeacons to thermostats and other consumer-facing parts of the Internet of Things (IoT). We might classify anybody who works in any of these areas as a digital professional.
However, in spite of its technology-centric label, digital is not really about the technology per se. What’s important about digital today is the fact that customer preferences and behavior are driving organizations’ technology choices – most notably in the B2C world, but in B2B as well.
Digital transformation, therefore, includes more than technology change. The real transformation is organizational, as customers are driving enterprises to change the way they do business in fundamental ways.
Transforming the Role of Marketing
This transformative nature of digital predictably impacts the roles of digital professionals. While the first-generation web was first and foremost a marketing channel, digital goes well beyond marketing – or from another perspective, marketing itself is undergoing a digital transformation.
If we define marketing broadly as the part of an organization responsible for communicating with current and prospective customers, then digital is shifting and expanding the roles of marketing to include data scientists, engineers, and architects of many varieties, as well as operational personnel, both on the business side as well as within IT.
As a result, digital teams tend to be cross-organizational. For innovative enterprises, these teams should be self-organizing and only loosely connected to the management hierarchy of the rest of the organization. Most importantly, digital teams should not contain only techies. They should have a mix of different skillsets from different parts of the organization, where ideally the team self-selects and self-organizes to tackle the task at hand.
Will Everyone Be a Digital Professional?
My definition of the digital professional is necessarily quite inclusive. However, while it might sound like everyone in the organization falls into this category, such an eventuality is unlikely and often undesirable. Certainly, as enterprises ramp up their digital transformation efforts, most of the organization will have a much broader variety of roles and goals than the members of the digital teams.
Even as digital transformation takes hold, I would expect only some organizations to end up essentially becoming all-digital. True, it’s possible some web-scale companies may fall into this extreme case (Netflix and Spotify come to mind as possibilities), but it’s unreasonable to expect every bank or manufacturer or government agency to transform into the next Netflix.
Nevertheless, even for traditional enterprises, digital transformation will spread horizontally across the organization, recasting and re-purposing people as they shift from their traditional roles to becoming digital professionals. You don’t need to be a techie in IT, and you don’t need to be a web guru to qualify. But you do need to focus on the shifting needs and preferences of your customer.
Posted by Jason Bloomberg
on November 13, 2014
OK all you techies, time to go back to English class. The word “data” is the plural of “datum.” If you use either word improperly, I’ll hit your knuckles with a ruler, so pay attention.
Wrong: This data is important.
Right: These data are important.
Wrong: Big data is useful when we use it properly.
Right: Big data are useful when we use them properly.
Wrong: Which piece of data am I looking at?
Right: Which datum am I looking at?
And so on. Now, before you freak out and realize your entire existence has been meaningless up to this point because you never saw a datum in your life, relax. Most language experts admit that correctness follows common usage, and since people commonly use the word “data” as though it were singular, that means it’s OK to do so. So carry on, you wretched English destroyers, you.
Common usage or not, treating “data” as plural is still correct. It’s up to you whether you wish your language to be correct, and presumably, as long as people understand you then it doesn’t matter in many situations. But in other situations, it’s important to be correct – or at least to know what is correct, so that if you break the rules, you do so intentionally.
In my writing, I predictably use the word “data” quite frequently, and I endeavor to use it properly every time. And while correctness is important to me, I’m willing to break rules when I feel like it. After all, the previous sentence began with “and,” now didn’t it? In the case of “data,” however, I stick to the rule book for a particular reason.
Data, you see, are inherently plural. When we have a data set, we have a set of many things, not just one thing. In many cases those data are varied and diverse. Especially in today’s big data world, our data are likely to be quite heterogeneous. Referring to them in the plural, therefore, emphasizes both the diversity and the discreteness of our data.
“Information,” however, is a collective noun. We cannot count our information the way we can count our data. We never say “informations” – and for good reason. Information is fluid. It’s difficult to quantify, unless we break it down into data first. And most importantly, information depends upon the recipient: data only become information if there is at least potentially a person on the receiving end who can understand them. Otherwise they’re just noise.
Each datum can be thought of as made up of individual bits or bytes, concrete units that we can count, move, and calculate with. Information, in contrast, must inform – an essential abstraction of the data that brings humans into the loop. Emphasizing this distinction is why I always treat “data” as plural. Break this rule if you wish, but remember, I still have my ruler.
Posted by Jason Bloomberg
on November 6, 2014
I attended a Business Architecture/Enterprise Architecture power panel yesterday at the Building Business Capability conference in Miami. Since I knew two of the panelists personally, I expected a lively conversation—and in that respect I wasn't disappointed.
However, there was one word that nobody mentioned the entire session, neither panelists nor audience members: agility.
From my perspective, agility is—or at least, should be—the primary driver for Enterprise Architecture (EA), as well as Business Architecture, for that matter. But no one seemed to be on the same page.
True, there was a discussion of change, and John Zachman did state that the primary reason to do EA is to help organizations change. But no one suggested that EA should help organizations become better at dealing with change. And therein lies the critical distinction.
EA (as well as Business Architecture) has long clung to the "final state" myth. If we can define a final state and help the organization get there, then we can consider ourselves successful. By then, of course, the desired state will have changed, so we pick up our skirts and rush to the new destination, bouncing from one purported final state to another, as though we were trying to pounce on some kind of enterprise leprechaun moving his pot of gold.
It's time to stop the madness! Jumping from one illusory goal to another doesn't serve our organizations well. Instead, let's raise our game. Focus on embracing change. Live it. Breathe it. That's what business agility is all about.
And for what it's worth, nobody mentioned the word innovation, either. Why am I not surprised?
Posted by Jason Bloomberg
on October 30, 2014
The latest cyberattack to hit the news is POODLE (Padding Oracle on Downgraded Legacy Encryption). While POODLE wins points for both the cutest title and including Oracle in its name, I’m cheering it on for a different reason.
POODLE compromises the obsolete protocol SSL 3.0. That alone wouldn’t be a big deal, but the attack is sneakier than that: it tricks browsers and other applications that use more recent, more secure transport-layer security protocols into downgrading to SSL 3.0, leaving them vulnerable to attack.
The best way to protect yourself from POODLE is to disable SSL 3.0 across your entire IT environment – servers, browsers, the lot.
And that’s why I’m cheering. You see, the bane of many an IT manager’s and web developer’s existence is Internet Explorer Version 6. This browser version has been obsolete for years, but numerous enterprises still insist on remaining standardized on it. It doesn’t support HTML 5, which is one of the many reasons web developers hate it. But that deficiency alone hasn’t forced shops to switch browsers.
The good news, however, is that IE 6 doesn’t support any transport-layer security protocol newer than SSL 3.0. So not only can POODLE drive a big truck through IE 6’s security defenses, but now every IT shop must disable all SSL 3.0 support to protect the rest of its applications. And that means finally getting rid of IE 6 once and for all.
Posted by Jason Bloomberg
on October 24, 2014
Having built my first Web site – and written my first article about the Web – back in 1995, I have the privilege of counting myself among the senior citizens of the Internet. So for all you young pups out there, gather around and let grandpa tell you a scary story, just in time for Halloween.
Once upon a time, there was a speculative bubble, as more and more people saw the stocks of fledgling dot.coms go up and up, in spite of the fact that many of them hadn’t turned a profit, and in fact, several didn’t really have any viable plans to do so. But we all started believing our own hype, and the VCs and other investors piled on, and companies who rushed to go public saw insane ramp-ups of their stock prices, mostly on the relatively new NASDAQ market.
Until, of course, the speculative bubble burst, and the whole shebang came crashing down, taking with it the fortunes of many established technology players as well as the telcos, leading to what I snarkily like to call “Bush Recession #1” around 2001.
But while dot.com darlings like Kozmo, Boo.com, and TheGlobe.com are relegated to the history books, others like Amazon.com, Google, eBay, and Yahoo are still with us. It took a few years, but the technology and telco sectors are now going full steam, in spite of the financial crisis and – you guessed it – the 2007–2009 “Bush Recession #2” (two recessions, same Bush). But while the financial crisis hit banking and real estate, it only presented a speed bump on the road to today’s insane technology run up, as the all-time NASDAQ chart below will attest.
That ominous spike around 2000 was the dot.com bubble, of course. The little dip around 2008? Well, that was Bush Recession #2, aka the financial crisis.
Sure Signs of a Bubble
As we complete our scary Halloween story, the question now is, do we live happily ever after? Direct your attention if you will to what’s happened with the NASDAQ since the last recession. A rather sharp run up, wouldn’t you say? Now my crystal ball is no clearer than anyone else’s, but my bet is that we’re in the midst of yet another speculative bubble – and all bubbles pop sooner or later.
I’m no financial analyst, but I do follow the technology marketplace, and I have the perspective of someone who lived through the dot.com bubble. The reason I make such a dire prediction isn’t primarily based on the chart above, but rather the following similarities between what’s going on today and the period during the dot.com run up.
Crazy Money: Acquisitions. In “normal” times, a small tech company with a few dozen people and no profits might sell for a few million bucks. Well, you don’t have to be an avid reader of the financial press to know that there has been a series of such acquisitions, only in the billions of dollars. Facebook picked up WhatsApp for a cool $19 billion. Google bought Nest Labs for $3.2 billion. Facebook again used its clout to buy Instagram for the bargain-basement price of only a billion – to name a few.
These transactions in and of themselves aren’t necessarily the primary indicator of a speculative bubble. Rather, it’s the effect they have on other small tech companies and the people who work for them – or who might want to join such a company. When people start or work for companies because they see them as lottery tickets with billion dollar jackpots, rather than opportunities to make some money solving problems for customers, you have a huge “party like it’s 1999” red flag on your hands.
Crazy Money: Investments. Remember Zefer? No? Well, listen to Grandpa again. Zefer was one of a group of dot.com consulting darlings we liked to call iBuilders (I worked for a time at another iBuilder, USWeb/CKS, which became spectacular dot.com flameout marchFIRST, but I digress.) Zefer made history back in 1999 when they snagged a $100 million VC investment – unheard of at the time for a professional services firm.
Today, $100 million is chump change. So far this year alone, we’ve seen VC investments of $1.2 billion for Uber. $325 million for Dropbox. $250 million for Lyft. $200 million for AirBnB. $160 million for Pinterest. $160 million for Cloudera. $158 million for Box. And too many deals in the $100 million range to list (numbers from here). And those are just some 2014 deals – many of these companies had received many tens or hundreds of millions in earlier rounds.
And just what are these companies supposed to do with all this green? Grow. Big. And fast. The VCs are looking for “multiple baggers” – a simple 10% or 20% return on investment isn’t good enough for this group of one percenters, oh no. They want to multiply their investments many times over. 1000%. 2000%. Those are the returns they’re really hoping for.
When investors put $10 million into a company hoping to get $100 million out that’s one thing. But just what will that NASDAQ chart have to look like for the investments like the ones above to pay off? Might as well invest in tulips.
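To put rough numbers on that question, here is a back-of-the-envelope sketch (mine, using the 2014 round sizes quoted above) of what "multiple bagger" returns imply for these stakes:

```python
# 2014 VC round sizes quoted above, in dollars.
rounds = {
    "Uber": 1.2e9,
    "Dropbox": 325e6,
    "Lyft": 250e6,
    "AirBnB": 200e6,
    "Pinterest": 160e6,
}

# A "10 bagger" multiplies the investment roughly tenfold; the modest
# 10-20% returns ordinary investors settle for aren't on the menu.
for name, invested in rounds.items():
    for multiple in (10, 20):
        exit_value = invested * multiple
        print(f"{name}: ${invested / 1e9:.2f}B in -> "
              f"${exit_value / 1e9:.1f}B out at {multiple}x")
```

At 10x, Uber's backers alone would need their stake to be worth $12 billion; at 20x, $24 billion. That is the shape the NASDAQ chart would have to take.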
Hype about Hype. Hype – which I might define as overblown rhetoric touting products or services with a limited current value proposition – is a common phenomenon at all times, and doesn’t necessarily indicate a speculative bubble. But when we start seeing hype about the hype, that’s when my alarm bells go off. Case in point: when The Motley Fool investing site publishes an article entitled Believe the Hype: The Internet of Things Is No Gimmick, then in my opinion, it’s time to sell all your stock in the market in question, which in this case is the ridiculously overhyped Internet of Things.
How to Mitigate the Fallout
The great thing about playing musical chairs is we all have a seat until the music stops. So get while the getting is good to be sure. Also remember that it’s anybody’s guess whether a correction in the technology marketplace (or the broader digital marketplace, as the Ubers and AirBnBs of the world aren’t really technology plays) will lead to a broader market recession. After all, unemployment in the US has been going down steadily for years (yes, the “Obama Recovery,” naturally), and the Federal Reserve has yet to even start raising interest rates to cool inflation, both signs that we have a good while to go before the broader market cools.
In the meantime, my advice is the same advice I’d give any business at any time – only during speculative run ups, this advice becomes even more important: focus on the fundamentals. Businesses exist to serve their customers. Customers pay for products and services they want or need, and companies make money by providing them at prices customers are willing to pay. So simple, and yet so easy to forget during times of insanity. Keep your eye on your customer and you’ll do OK – even if everyone else is partying like it’s 1999.