2013-12-10

Life Lessons

As I sit here with my coffee this morning, I contemplate life lessons learned over the last 6 months. Primarily I'm thinking about trust, but I'm also thinking about the method whereby we self-induce our experiences.

Trust is the principal focus of my thoughts today, though, specifically how perception can alter our awareness of relationships. If you take any two people who have a trust deficit, you need to consider the views of both to reach an appreciation of the truth. After all, nothing is black or white in this world. The sole exception to this is when trust is broken by intent, of course, since that represents a decision by one participant to use deception as a means to some end. Once that happens, they have forfeited any claim that their view is truthful. Whatever truth their view held is washed away with their integrity, and integrity of viewpoint is what lets us associate a viewpoint with truth.

In the Star Wars films there is a line, in Return of the Jedi, where Obi-Wan Kenobi remarks that what he said was true "from a certain point of view." He is commenting upon his claim that Anakin Skywalker was killed by Darth Vader. His view is that Anakin's acceptance of the title alone was an expression of choice, and therefore essentially an abandonment of the righteous path and a form of spiritual death. Unlike Master Yoda, whose convictions in Episodes 1 to 3 are founded on dogma, Obi-Wan is striding along a far more subtle path, and his weariness is a recognition of the price of holding so stringent a view. When one hearkens back to the fight at the end of Episode 3, the issue is further clarified, because there is another critical formative line. Obi-Wan says at one point, just before Anakin's choice to finally attack, that he has the high ground. He is not only speaking tactically; from a spiritual perspective he is observing, without dismissing Anakin's views categorically, that there is a purity in his own path that his brother lacks. That purity is how Obi-Wan manages to sustain his view that purity of truth matters less than purity of intent.

In real life, obviously, there is never such a linear proof of the idea of truth being mutable. The "he-said-she-said" syndrome comes about precisely because of this, and it deeply affects all future communications between parties who suffer moments of mistrust. That said, the last 6 months of my life have pivoted around the ideas of trust, trust misplaced, and viewpoints.

About 13 years ago I became attached to a development project that involved two parties, neither of whom behaved entirely respectably. When they came to a falling out, I had to choose whose product my effort was producing. I had to decide who owned the idea. Following logic, I decided that the IP had to accrue to the people who seeded the idea, even if their original seed was pale by comparison to the end results. This decision cost me financially at the time, leaving me holding a massive related debt, and drove me to sustain the development effort when, in lieu of the necessary rates they could not pay, I was offered shares. To shorten the story: in retrospect, I misjudged my options. I ought to have walked away, taken the loss, and attenuated my pain, because from the moment I chose the other path I was essentially doomed to end that journey as a castaway. Again in retrospect, all the signs of manipulation were present, but despite my own con-man instincts, I decided trust was fundamental to the process, and even when I mistrusted actions I didn't mistrust intent. That was probably the worst single decision I ever made, because I suspect I was used from the outset and thrown away when the risk of that use being exposed outweighed my value. That said, I would wager the principals of the company would hold that I was the problem child, though that doesn't explain why they eliminated my share position by way of bankruptcy before they did a deal with a company that had been courting them for months. In the end actions speak volumes, and the fact that they are using the product I built under the guise of a new company is complimentary, I suppose.

The point I'm aiming at, though, is not a sour-grapes moment. It is an observation about basic trust contracts, and how they are valueless if either party betrays the terms. Revisionist viewpoints cannot overcome intent, because the intent to gain is inherently selfish. There is no truth in business, because trust is sacrificial. It is the mechanical way of business. Contracts are, as they say, rather cheap.

Truth cannot exist without trust, because only trust imparts a foundation for truthful exchanges. And trust that is one-sided is a suicidal behaviour. I know in my case it has destroyed me financially, and is probably going to land my family on the streets, not because I can't work onward so much as because I committed fully to a cause for 13 years and left with nothing to show for it. I have the pleasure of watching my efforts enrich others, without even a Merry Christmas. And yet I feel less bitter than weary, because the real violation happened so long ago it is not in the forefront of my mind. I caused my current distress by misplacing my trust early, and was repaid in full and then some by folks in whom that trust was never really founded.

The life lesson I take from this is not that one should never trust. To the contrary, one has to. The lesson is more that we need to be conscious of the warning bells. My stress led to a mild heart attack (a pair of them, two weeks apart), and my life is worth more than any creation my work generates. To dwell upon this sort of thing is less important than to rise up and move on. No matter what happens in life, one can cling to Obi-Wan's statement, with the caveat that to sustain truth you need a foundation of trust. Faith, I suppose, is an adequate word. Faith in self, if not in some higher power.

Ultimately the choices we make define us, and we make those choices anew today, even in moments of bleak struggle. To allow past mistakes to force one to repeat them is counterproductive. And, really, as I can attest, you can only run from your bad choices for so long. I expect that is true for everyone.

2013-11-27

What Comes From The Mind...

Over the last week or so I've been thinking a lot about what comes from the mind, about trust, and about the human condition. My conclusions are both amusing and disturbing to me, but oddly not surprising.

As much as folks pay lip service to the value of ideas, honest assessment of that value is not really a human strength. If an idea excites, it is held up as vital; if it bores, it is degraded as tired; yet if you analyze ideas historically, you often find the most mundane ones have the most lasting impact on the world. What comes from the mind, then, is valued lightly, and that is probably why so many people will co-opt ideas without ever honouring their source. We all do it, to some extent, and it is where commonplace idea-theft, like the kind we see with plagiarism, arises. The fact is this idea has probably been expressed before, so the only reason I can even partially claim it is that I don't know it has been expressed. And that brings me to the perception factor....

A large part of what we view as our reality is founded in selfish perception. It is how we survive, ultimately, because if we failed to internalize like that I suspect we would be driven mad. Can you imagine, for example, having an amazing idea on a Sunday afternoon and, before transcribing it, spending all day searching the Internet to assure yourself it is unique? People would never get a damned thing done! More to the point, it would destroy the basic process by which ideas develop into expressions of new ideas. Every idea that exists now is, in truth, a homage to whoever walked a similar path before.

When you express something from your mind you are bound by a fundamental expectation of social trust. You have to trust, for example, that if your idea is truly unique then those who hear it will at least have the sense of honour to try to reference your contribution. This attribution is a basic tenet of social intercourse. It is how we judge our interactions, after all. And dishonesty in this sort of basic interaction, a violation of that fundamental expectation of social trust, is a truer form of dishonesty than most, because it cripples the openness of communication.

Modern businesses based upon Intellectual Property (IP) are often founded on disputed ideas, and while it is occasionally because people genuinely don't know they stood upon the shoulders of giants, it is more often apparent that the theft of ideas is the source of those business aspirations. And this happens because even when ideas have inherent value, people convince themselves that the ideas are not as practically valuable as their implementation. And while this was true in the past of such things as hammers (how you use one creates value beyond the physical hammer itself), it is clearly not the case with software, works of art, or other virtual expressions of ideas.

I think there is something in the human condition, in the structure of our minds, that makes violating social trust easier when you can convince yourself an idea is inherently without value. And yet the irony is that few people violate social trust when they have no measure of gain. What that says about people is probably not kind, but also not universal. There are, after all, altruists who are honest, just as there are people who claim such a title without acting in accordance with the claim.

I wish that we had the courage as a species to become what Gene Roddenberry imagined for his Star Trek universe: a species that values all contributions, however small, and respects that all living things have claim to some talent. If we could value ideas, and those who express them, and those who expressed what came before, then I believe we would be a stronger species. Maybe in daylight I will even believe it possible that we are.

2013-10-05

Windows 8: The Good, The Bad, The Truly Ugly...

I've been using Windows 8 for a while now, and thanks to Classic Shell I don't often have to look at the funky, chunky, ugly new start menu. Overall, in the last year (give or take a bit), I have formed some impressions about the O/S that I feel like plunking down.

First, to defenders of the new look and feel...you're wrong. It isn't that the new UI elements are all bad, but they were not designed with desktop computing in mind. They were designed for some specific form factors (tablets; maybe phones, though I doubt some of those choices), and all else was damned. I don't know whether Windows 8.1 will rectify these issues, but the problems with the new UI are ingrained in whatever principle of design was applied. When you dumb down interfaces, you ensure the reduction of the utility of devices over time. Sure, grandma can see the big button, and my big thumb can hit it reliably on a phone, but it is a choice that eschews improving the user in favour of meeting the lowest common denominator. And while that might seem to some to equate with "making it easier", the long-term impact is the opposite.

I have an example to prove this point: the Settings charm and the Control Panel. Yes, Control Panel is daunting, but it has power -- and the whole point of accessing the 'control' elements of a system is to control it. Half-control is not the purpose. Yet the PC Settings charm is under-powered to a degree that, if you really need control, you have to go back to the Control Panel anyhow. So all this new UI element did was fracture the points of access to the underlying control interfaces. Some settings are available in one location, most are available in the other. Whether that is eventually rectified isn't the point. The point is that a proper UI redesign would not have caused further access fracturing. It's foolishness to suggest otherwise. And that fracturing means that when users really need to access the guts of the PC settings, they still have to step to the Control Panel, making the uncertain user even less capable over time.

It makes no sense unless the purpose of the new UI is to actually create this sort of confusion, and reduce user knowledge requirements permanently. And that would be fine if we were talking about limited-value devices, like purpose-focused tablets, but given the incredible reliance of business processes on IT, reducing the utility of the devices is reducing the efficiency of those processes. A new UI for the sake of itself is not proper design.

Second, at an O/S feature level...Windows 8 is an incremental improvement. Network file copying is, for example, much more friendly from a feedback perspective. BUT...the new UI does get in the way of those improvements at times, and, frankly, I have never managed to really get networking on a Windows 8 box in a mixed network client environment to behave. The fact almost all of the quirks are UI-related is telling, and sadly obscures some really interesting tweaks beneath all the presentation junk. And this somewhat renders the feature-improvements moot. What is the point of a better O/S if the better parts are defeated by the surface? (No pun intended.)

Third, and this is a concern I didn't expect to be expressing about an O/S that is essentially a fractional version above Windows 7: there is something inherently unstable in this O/S. I bought a laptop made for Windows 8, touch-screen and all (a waste, that), and in the last year I have noted strange problems that seem integrally bound to updates. Every time there is an update, for example, one or more aspects of the network rigging seem to be altered arbitrarily. (The latest was the addition of a Homegroup icon on the desktop, which seems to be impossible to remove.) Five times, after an "update Tuesday", a network that had functioned prior to the update just fell apart in terms of shares and file sharing. In two cases the configuration actually was altered by the updates, and in three others I have never discovered the cause of the problems. This is just one of the more obvious issues, and not even as aggravating as a few others. (The most singularly odd and annoying aspect is that the file system is non-responsive to certain events. Moving a folder or deleting a file ought to remove it from the UI without requiring the user to hit F5 to refresh -- but when they do hit F5...it ought to refresh instantly. Granted, this oddity has existed for years on certain systems, but that it still remains in Windows 8 is just unforgivable laziness.)
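For the technically curious, the plumbing involved here is almost certainly the shell change notification: Explorer redraws a view when it is told a directory has changed, and a stale view is what you get when that notification goes missing or is mishandled. A minimal Python sketch of the announcement side, via ctypes -- the constants come from the Win32 headers, and the folder path is purely hypothetical:

    # A minimal sketch, assuming a Windows box: tell the shell that a
    # directory's contents changed, which is what prompts Explorer to
    # redraw without an F5. The path below is purely hypothetical.
    import ctypes

    SHCNE_UPDATEDIR = 0x00001000  # "contents of this directory changed"
    SHCNF_PATHW     = 0x0005      # the item is a Unicode path string

    def notify_dir_changed(path):
        ctypes.windll.shell32.SHChangeNotify(
            SHCNE_UPDATEDIR, SHCNF_PATHW, ctypes.c_wchar_p(path), None)

    # After moving or deleting files under this folder, nudge the shell
    # so any open Explorer view refreshes.
    notify_dir_changed(r"C:\Users\Public\Documents")

A well-behaved program sends that nudge itself, which is precisely why the shell having to be nudged at all for its own file operations is so galling.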

So after about a year of using Windows 8, on the cusp of the new 8.1 rendition, what is the sum positive impact of Windows 8? Nothing, I suspect. It isn't that it lacks for improvements, but that none of them are significant in the context of its oddities, its incomplete design, or its farcical failure to address standing problems from previous versions of Windows. MS needs to focus on the fundamentals soon, or the problems they have failed to address for years will ultimately undermine their market share...because as other options improve incrementally, stagnation will doom them.

2013-09-23

The Underemployed Programmer

I really ought to give up the field and become a maintenance engineer! This thought keeps rolling around in my head as, once again, I find myself playing the role of the underemployed programmer. Yes, there are jobs to do, but it appears I have passed my best-before mark, and so I am once more underemployed. I work, I occasionally get paid, and I struggle to apply what 26+ years of experience has taught me, because no one seems to really care what 26+ years can teach a person!

Enough whining. Onto a serious thought or two about experience, coding, and work.

There is some truth to the idea that you don't need a lot of experience to work in the programming field. It is about being creative more than being technical, and the best minds are a combination of the two that leans toward the creative. But what is often lost in this view, correct to some extent though it is, is a fundamental truth about coding. Experience doesn't make a better coder, but it does make a better coder better. Just as a mediocre coder improves with experience, so too does the talented one...and that is worth something. The field is about lifelong learning, and people in it tend to age well. What you lose in raw energy you more than make up for in stamina and awareness.

Experience is how a project that could cost 30 grand ends up costing 15, or how 10 months of work gets done in 5. And experience is what turns work into productive opportunity. Yes, younger coders tend to pile on hours, but seasoned coders are the ones who pile on value return.

None of that is a shot at the inexperienced coder, either. They are the experienced coders of the future.

What is sad, though, is that the work out there largely comes in two categories these days: grunt labour and fantasy fishing, as I like to call them. Grunt labour is where the heavy lifting gets done, and it isn't sexy. It's the kind of product I spent a lifetime delivering. Fantasy fishing is where the pretty iPhone app of the week comes from, where the quality of the code matters less than its positioning. Fantasy fishing is the kind of work that is sexy and fun. Sadly, very little falls in the middle, and there is almost no real innovation in business-process tooling.

It's a shame that so many businesses today are so focused on legacy maintenance that they forget the future, and sad that investments go to the fantasy fishing realm before they ever even look at the grunt marketplace. And it's a shame so many programmers with so much experience are so chronically underemployed, when the experience and knowledge they possess could lead to the kind of lasting productive innovation the economy so desperately needs.

2013-09-17

Amazon's Kindle Self-Publishing

Well, I just finished my first self-publishing exercise with Amazon's Kindle Self-Publishing platform, and...gods, they need to invest in their process!

Given the cut they take, you would think a few dollars spent making the process less clumsy would be worthwhile, but maybe not. Still, as a very technically capable person, I have to wonder how such a clumsy processing model ever got approved by such a large company. This is a company with worldwide presence and billions (and sad monopolistic tendencies), and their publishing process looks a lot like what you would expect from an outfit with a cash reserve of seventeen dollars and a pizza fetish. Granted, the Smashwords process is not perfect either, but by comparison the Amazon process is just painful. The layout is awkward, the process stop-points are unclear, and at least on every browser I tried you actually can't see the text regions you need to type into until they are selected -- and sometimes not even then -- meaning even when you know you need to type in them, finding them is wickedly difficult.

What really irked me though wasn't even so much the mechanics of the process (please, Amazon, drop the mobi format and just get with the standard epub one!), but the in-your-face way they express their monopolistic flair.
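To illustrate why epub is the saner target: it is an open format, just a ZIP archive whose "mimetype" entry carries a fixed value, so anyone can inspect or sanity-check a file with a few lines of code. A quick Python sketch -- the filename is hypothetical, and a real validator would check far more:

    # Sanity-check the outer structure of an epub: it must be a ZIP
    # archive containing a "mimetype" entry with a fixed value. (mobi,
    # by contrast, is a proprietary container with no such open check.)
    import zipfile

    def looks_like_epub(path):
        if not zipfile.is_zipfile(path):
            return False
        with zipfile.ZipFile(path) as zf:
            if "mimetype" not in zf.namelist():
                return False
            return zf.read("mimetype").strip() == b"application/epub+zip"

    print(looks_like_epub("my_first_book.epub"))  # hypothetical file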

Perhaps the most egregious example is that unless you are willing to halve your potential royalty, you can't opt out of the lending program. Now, I have no issue myself with folks lending my books, because I know that no amount of hand-wringing will stop the eager pirates of the world; and I have no illusions that I will ever make enough money writing for it to cut into my bottom line. (I also think I write well enough that those who enjoy the work will probably pay if they can, because they might want more.) But by enforcing lending, or poverty, Amazon effectively either takes half of an author's meagre income or gives away a free copy with every copy sold. Yes, I know their arguments, and that the limited lending period is 14 days, but seriously, folks...who needs 14 days to read a book that is worth reading? And if a second copy is issued for every one sold, you have effectively halved the royalty anyhow. So why offer 70% when you have no intention of paying it? More to the point, opt-in would be ethical...opt-out never is. Only monopolists practice reverse incentive schemes.
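To put rough numbers on that claim, here is a back-of-the-envelope sketch, assuming a $2.99 list price, ignoring delivery fees, and assuming the worst case where every sold copy is also lent to one reader who would otherwise have bought it:

    # Back-of-the-envelope royalty arithmetic. The price and the
    # worst-case lending assumption (one loan per sale, displacing a
    # sale) are illustrative; delivery fees are ignored.
    price = 2.99

    royalty_70 = 0.70 * price  # per sale in the lending-enabled tier
    royalty_35 = 0.35 * price  # per sale in the opt-out tier

    # One sale now serves two readers, so per reader the 70% tier pays:
    per_reader_70 = royalty_70 / 2

    print(round(per_reader_70, 2))  # 1.05 per reader
    print(round(royalty_35, 2))     # 1.05 -- identical; the 70% is illusory

Under that assumption the two tiers pay the same per reader, which is the whole point: the headline 70% only holds if lending never displaces a sale.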

Of course, I might eat these words if someone buys the book in record numbers, but I doubt that, and I suspect it will be a while before I dip into the Kindle world again, because on top of all else the process is not friendly. Even the most basic parts of it require more technical effort than most real authors I have known can muster. It thus becomes an exclusive process, whereby the published are the technically adept rather than the purely creative. That is not a trend to foster, because its exclusivity harms broader culture. And though the central rot setting into our age is probably irreversible, it still is not helpful to have corporations enhancing the decline by creating pockets of exclusivity where none need exist.

What finally irked me was their "quality standards." I probably met them easily, because I write stories and the formatting is basic, and I do understand their desire for high-quality presentation...but what hypocrisy! Their own interface for letting authors upload is so horribly clumsy that you wonder if it was written by grade-school interns! If you want to impose "quality standards" and not look idiotic, you need to impose them on yourself first. Seriously, Amazon, review the look, feel, and functionality of the platform. Then your quality standards might not smack of parochial ignorance.

2013-03-15

Considering Windows 8

Since Windows 8 came out I've been using the release version on a brand new Acer laptop. The laptop itself is a reasonable one, more than up to the job of day-to-day use, with a touch-screen specifically intended to make Windows 8 easier to use. Beyond that, nothing about the machine is Windows 8 specific in any way, but I can also say that nothing about it is too anemic to run the O/S well. In short, I would have no complaints about recommending a similar or identical configuration to anyone using the new O/S. Rather than list the specs here, I'll simply say that the following commentary isn't affected by any hardware-specific quirk that prevents the O/S from doing what it intends.

Would I recommend Windows 8?

In short, no, even though it does have some useful O/S-level extensions that are improvements. The file copy visualization alone is handy, as are some of the fine tweaks to the file explorer and other applets that come as a standard part of MS O/S setups. But taken together there is not one compelling reason to recommend Windows 8, and there is one overriding reason not to recommend the O/S as an upgrade. (Having said that, it comes on new machines, and frankly there isn't much reason to avoid it either in those situations.)

The overriding reason to avoid Windows 8 is its "new" menu system (formerly called Metro). It is simply not a valid UI for a desktop machine, regardless of whether you have a touch screen available. (That is why I noted its presence above; it really isn't a compelling extra.) Besides looking like a unicorn has vomited on the monitor (the colour schemes are awful), it is chunky and inefficient on a desktop. Ultimately, it looks like a candy-box cover, I suppose, and maybe it does work on tablets, but it simply isn't effective on a workplace machine. Is it a great productivity drain? No. But does it add anything at all? No. And yet it is the main reason to avoid the O/S, for two reasons that become unavoidable conclusions if you use the O/S for any period of time to do actual work.

The first reason is that, as UI conventions go, on a laptop (or desktop) monitor the chunky, flat nature of the interface means a significant amount of scrolling. Yes, you can trim the iconography, choose small instead of large panels, and so on...but ultimately a menu is used to launch tools, and the absence of multilevel grouping is problematic. In the old Start Menu (or Orb, or whatever the damn thing was called) you could create sub-folders to hold ancillary programs, so that the main menu exposed your primary applications and you could drill down on the few occasions you needed a meatier interface. In this new model you have one left-to-right grouping pattern, or otherwise have to use the search UI to find what you want, or have to put up with an inordinate slew of visually boring iconography. Try installing MS Office 2013 on a Windows 8 box and you instantly see the problem: every installed tool suddenly has the same level-precedence as the primary applications. And, really, the Microsoft Clip Organizer is not sensibly left on the main menu beside Word. It just doesn't make sense. But the menu system is flat, so there is no drill-down -- an item is either visible or hidden. And while for the expert user hidden is an obvious choice, inexpert users are not going to find that feature, and if they do they are not going to remember to search for their applications once those are accidentally demoted from the main scrolling menu. It's simply a bad design taken to ridiculous prominence, and it renders a working machine clumsy. Yes, it probably does work on a web-only style box, but frankly forcing the desktop user to suffer it is an unnecessary step.
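The structural point is easy to see in miniature. A toy Python sketch, with hypothetical entries, contrasting the old hierarchical menu with the new flat one:

    # A toy sketch of the grouping argument. A hierarchical menu tucks
    # ancillary tools into a sub-folder; a flat menu promotes every
    # installed tool to the top level. The entries are hypothetical.
    hierarchical = {
        "Word": None,
        "Excel": None,
        "Office Tools": {  # drill-down absorbs the rarely used items
            "Clip Organizer": None,
            "Picture Manager": None,
            "Language Preferences": None,
        },
    }

    flat = ["Word", "Excel", "Clip Organizer",
            "Picture Manager", "Language Preferences"]

    print(len(hierarchical))  # 3 top-level entries to scan
    print(len(flat))          # 5 -- and a real install adds many more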

The larger reason this is something of a show-stopper, though, is extended user experience. On a machine intended for work, the Windows Store is a poor experience in terms of depth of real utility. Sure, there are games and the like, but actual utility applications (things you do real, lasting work with) are few, and those that do exist almost always end with the same launch experience: click the icon, and be punted to the old desktop minus the "start" button element of the UI. And that raises a question that displays the almost arrogant foolishness of the UI designers who let this happen -- why give prominence to a UI convention like that new menu when the end result of clicking most icons in it is to pitch you to a familiar working desktop? It makes no sense whatsoever to layer on a shell that is functionally impaired to that degree. (The same could have been said of all experiences inside Windows before Windows 95, but the difference is that this clumsiness is not a paradigm-shift requirement.) So many better methods of engaging the flat-model UI could have been chosen while recognizing the transitional necessity of the desktop. For example, in the simplest model: reduce the start button to a flat panel button with the Windows logo which, when clicked, expands the menu system; keep the bar, folder, and menu model on the left as a side-bar; and expand the right pane region to show the flat iconography, basing the decision on where each icon is displayed upon some detection of whether it is new-style code. That would have allowed the desktop world to coexist, reduced the transitional stuttering, and maintained the preeminence of the new menu model by giving it a visual showcase without interfering with functionality.

The problem with Windows 8 is at once small and large, because of a fundamentally mistaken assumption about the need for change, and about the nature of change. Yes, a new UI convention for menu management might be wise (ribbons, for example, are actually not a great trial once you get used to them), but there is no need to break with the past when the transition is going to take time. People would accept that old programs remain in their old menu model and new ones absorb a new model. But by forcing away the menu model (Classic Shell can return it), MS created an experience in which users doing real work on a machine will become aware that their required applications do not play well in that new menu; they are constantly shunted to a visually crippled desktop, and my guess is they will quickly gravitate away from new-shell-friendly applications when they begin to find the experience jarring. That could have been avoided with almost no effort, but instead the path of "we know best" was chosen, only to rapidly prove otherwise when every useful application punts the user to the old desktop UI.

Would I recommend Windows 8?

No, because in itself it gives very little of immediate value to a user who is on Windows 7, or even Windows Vista. Upgrades are unnecessary.

But nor would I say flee from it, because if your new machine comes with it, then with a few clicks (see Classic Shell) you can recover your old interface and never actually have to see the "new" look again. That isn't much of a recommendation of the UI or of Windows 8, of course, and that is a sad fact indeed.