(I sometimes write little fragments of writing like this. It's not really intended for anything other than a way for me to relax and practice writing. I usually write in a single sitting and then review for obvious grammatical or content issues. My thanks to my friend Terry, whose beautiful orchard was the inspiration for this little snippet.)
It was one of those spring evenings where you could smell the crocuses and the earthworms through warm and partly sunny rain showers falling aimlessly on the newly verdant hillside. The moss had fought a valiant fight this year to claim the yard, but as the days lengthened, the sun drenched the area in an unusually bright March, and the grass was victorious, ripping through the moss until, in a period of only a few days, it was hard to remember the conflict that had engulfed the turf only weeks before. The frogs were confused, as the weather had been too cold, then too warm, and was just now right. Despite the early hour, some frogs throughout the neighborhood were croaking in their hollow, wooden-sounding rattle.
It was the time of year for baseball, for running in the grass, for daffodils picked by curious children and brought in to wither on a table. It was the time when there was some hope that winter would not last forever, that the rain would eventually stop, and that the warm embrace of summer was rapidly approaching. It was a time for opening windows, and shaking out blankets, and deciding to roll a new coat of paint to cover the stains and dings from damage over winter.
It was into this perfect paradise of the Pacific Northwest, this veritable garden of Eden, that Jack strode that evening. He wore comfortable denim with an unbuttoned flannel shirt that really looked more like fall than spring. He was out inspecting the orchard.
The orchard was not a professional venture. It was not planted or maintained for the purpose of financial benefit, but rather for the purpose of enjoyment and leisure. An orchard can be hard work, but Jack worked the orchard because he enjoyed both the labor that he applied and the reward that he gleaned.
This lovely spring evening, it was still early. The plums had been blooming for almost two weeks now, but most of the other blossoms were merely peeking out from green, sticky, soft unfolding buds. Some of the apples would come much later. Some of the plants were healthier than others, and Jack stopped briefly to inspect a drooping six-year-old apple tree that was oozing sap. It might be possible to heal it, but it seemed to make more sense to replace it. Jack might be sentimental with his friends and family, but he was a little bit more matter-of-fact with his plants.
But despite seeming to ignore the helpless plight of the weeping young apple tree, Jack loved this orchard. It was a place of refuge. A place to till and to harvest. A place that he usually worked alone. A place that he shared with others, but only when it was ready. He was generous with his yearly crop. Certainly, he used a lot of it himself between the raw fruits themselves, and the jams, preserves, pies, wines, and ciders that he made. But there was always plenty left for others and there always seemed to be some that went to waste.
And that's what orchards provided. Solitude and fellowship, work and leisure, pruning and harvesting.
Way up in the hinterlands, miles from any other humans, two men faced each other on a rocky ridge covered in lava flows and avalanche lilies.
“It was clearly a suboptimal algorithm!” the taller man yelled, with his arms upraised.
“Everyone knows that premature optimization is the root of all evil,” the shorter, stouter man parried. “If we had known all the details of the operational environment at the time that we implemented the original code, we might have made different decisions, but ultimately, it doesn’t matter. The code worked and the solution was a good one -- at least for the first iteration.”
“The first iteration?! What is with you and iterations of software releases? Why not spend the time to do it right on the first try? Now your code is out there and it’s ridiculously slow -- the whole company looks bad when you deliver crap like this.”
The short man leaned against a large basalt pillar that rose almost unnaturally out of the mountain, like an ancient stela, erected to point out the glory of this lofty pinnacle. “First, make sure it works, then make it fast,” he said slowly. “Let me ask you this: do you think that the company would have looked better or worse if we had spent an extra two or three weeks on the initial release to make it faster, but the whole thing had a fundamental logic error?”
The tall man writhed. “Why do you assume that we would have introduced logic errors? The code was pretty straightforward; the algorithm wasn’t that unique or novel. It was probably new in some sense, but really just a collection of existing algos that were assembled in a new way. What makes you think it was such a hard problem to begin with?”
“Because humans are imperfect. I’m not even saying that there aren’t any logic errors in the first iteration -- err, the initial release, or whatever you want to call what I just wrapped up. What I am saying is that we’re building from a more structured base. We have good confidence that it works now due to the unit tests that we put in place. Sure, it’s slow, but we can always work to improve that speed; then we can regress over the unit tests to ensure that the underlying foundation is solid.”
The taller fellow shifted his feet on the hillside, lava rock crunching like broken glass under his boots. “I think those unit tests are a waste of time. Out of the two weeks you spent building the solution, you spent almost a full week on those. Some testing is great and all, but come on! You spent thousands of dollars on code our customer will never see!” He shifted again and this time a clatter of jagged rocks slid down the hillside and off the edge of a cliff about 30 feet below.
“Our customer doesn’t see the code for the unit tests, but it’s there for us to ensure that everything’s working. Tell me this, Bob: how would you have shown the customer that the code worked as they expected?”
“I would have let them watch it run -- watch it digest data and then view the results! That seems kind of obvious… how else would they accept it? Sometimes I don’t get your thinking, Fred.”
Fred kicked at the base of the column he was leaning against, flattening out a section of gritty soil, flecked with chunks of lichens that had fallen from the monument he now rested against.
“The tests are nothing more than a reflection of the requirements. The function that the customer described in their requirements is accomplished and realized by the collection of algorithms and code segments that we built and assembled. If each of the blocks of code that we assembled works perfectly with regard to its own inputs and outputs, and if the code segments are assembled in a way that logically generates the expected output from the input to the entire routine, we can safely say that we’ve tested the whole thing when all unit tests are green. Sure, that won’t take into account how much they like it overall, the architectural ‘goodness’ of the solution, or the way it looks, the fonts, the UI, or even the speed, as you keep harping about. But if you remember, the customer never gave us any non-functional requirements like what I’m talking about. What they asked for was an algorithm, and that’s what we built.”
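(A brief aside from the story, since not everyone pictures the same thing when a character talks about tests as a “reflection of the requirements”: a sketch of that idea, in Python, might look like the snippet below. The routine and its requirements are invented purely for illustration. Each test restates one requirement in executable form; when they’re all green, a faster implementation can later be swapped in and regressed against the same suite.)

```python
import unittest

def running_totals(values):
    """Return the cumulative sums of a list of numbers."""
    totals = []
    for i in range(len(values)):
        # Deliberately naive: re-sums the whole prefix each time (O(n^2)).
        # Easy to verify; speed can come in a later iteration.
        totals.append(sum(values[:i + 1]))
    return totals

class RunningTotalsRequirements(unittest.TestCase):
    """Each test case restates one customer requirement."""

    def test_empty_input_gives_empty_output(self):
        self.assertEqual(running_totals([]), [])

    def test_totals_accumulate_left_to_right(self):
        self.assertEqual(running_totals([1, 2, 3]), [1, 3, 6])

    def test_negative_values_are_handled(self):
        self.assertEqual(running_totals([5, -2]), [5, 3])

if __name__ == "__main__":
    unittest.main()
```

A later, optimized version of `running_totals` only has to keep this suite green to count as a safe replacement.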
Bob put his hands on his hips. “So you’re saying that the customer isn’t going to like what you delivered, and you anticipated that, but you still delivered it?”
Fred sighed. “No, but I am saying that the customer maybe doesn’t know exactly what they wanted when we started the project.”
“Oh, so now you’re smarter than the customer? Sounds like typical engineer ego.”
Fred squatted in the hollow that he had shoveled out with his foot and leaned back against the softer, mossy base of the rock, facing north. “I may be smarter or I may not be. That’s not the issue. The issue is that my job is not just to give the customer exactly what they say or to try to tease every possible requirement out of them at the beginning. My objective is to make a customer delighted with the software that I’ve written for them. It should meet their business needs, it should be useful, and it should be pleasant to use. That’s true of all software -- that’s true of all products. What my customer wants is to solve problems, and that’s why they agreed to pay me money. What I am saying is that right now, my customer has a piece of software in front of them that works. I haven’t said I’m done, I haven’t said that the software is perfect; I’ve merely presented them with a quick release and asked them to take a look and then talk with me about it.”
Bob dropped his arms to his sides and attempted to squat on the hillside, sending more lava rocks tumbling down into the ravine below. He finally settled with one foot pointed down the hill, and the other uncomfortably folded under him and pressed up against some unpleasantly sharp-looking chunks of obsidian. “I just don’t get it. You want to talk with your customer about slow software that doesn’t look pretty and only runs in an ugly console. Don’t you think they’ll be pissed with what they see?”
“It depends. We talked before I started work on the project. I explained what I was planning to do. I explained that I write software in iterations. They may not be used to doing business this way, but I think that their expectations are exactly in line with what I’m giving them. They may doubt the approach I’m taking, but I didn’t lie to them and I’m not delivering anything other than what I said I was planning to deliver. They know I’m a professional, and they expect me to do my job.”
Bob raised his eyebrows. “So you think they’ll be OK with this? Aren’t they just going to add a whole bunch of those requirements that you were talking about? Speed? How it looks? How easy it is to use?”
“Probably,” Fred shrugged. “It doesn’t really matter if they add those things now. Like I said, we have a solid base. If they want a different user interface, we’ll add that. If they want it faster, we’ll optimize it. What I’m happy about is that the code that I’ve just delivered actually will work. If they wanted to, they could start using it now.”
Bob started to raise a hand, but the whole swath of rocks below him began to slide. Fred grabbed a branch of a small and unhappy looking Douglas fir that had managed to sprout in the crack between the basalt column and the hillside, and hauled Bob up closer to him. “Watch your step there! That hillside isn’t stable at all!”
Bob muttered a thank you under his breath and sat down as close to Fred as he could comfortably position himself. Both men sat for a while, staring across the hillside and down into the seemingly endless valleys and terrain beyond. The sun was beginning to set, and off to the west the sky was lit up with a brilliant gradient of reds, oranges, and pinks.
“Did you know that sunsets are pretty mostly because of the pollutants in the air?” Bob asked. “If it weren’t for humans burning things, the sunsets wouldn’t be nearly as dramatic. I’ve heard that it’s the imperfections in the atmosphere that give sunsets like this so much color and variance.”
Fred nodded. “I’ve heard that too -- not sure if it’s true, but it sounds reasonable.”
The men fell silent again.
“It’s like switchbacks,” Fred said suddenly. Bob looked up, uncertainty on his face. “What I mean,” continued Fred, “is that the iterative process I’m describing is like switchbacks on a mountain. You don’t just charge up or down the mountain. It’s too steep; the footing isn’t sure. Better to progress from east to west, then west to east, then back again as you slowly gain elevation, like we did coming up. It’s slower, but it’s more steady. Plus, if you’re talking about a path that you’ve never been on before, or that hasn’t been blazed by anyone in the past at all, the movement back and forth gives additional perspective. As you traverse back and forth you get a better idea of the mountain you’re ascending. You can see whether your path is doomed to failure or if you can realistically reach the summit. Each switchback can feel slow and painful, but you’re safe and you’re constantly able to see what the next steps are going to be. You’re constantly analyzing and re-analyzing while still making progress towards the top.”
Bob looked down. “I guess I can see something in what you’re saying. So you’re saying that in some ways, software engineers are like Sherpas, leading their customers to the summit?”
“I like that,” Fred smiled. “It’s a lot like that. We know that our customers want a great experience, and we’re the professionals that know how to get there. We don’t want to promise them a spectacular view only to find that they were hoping to see Mt. Rainier and we were leading them up a slope to see across the Puget Sound. We could ask them a million questions in a questionnaire about every last detail before they begin. Things like ‘do you like flora more than fauna?’ or ‘do you prefer trees to rocks?’ But a lot of the people that climb these hills wouldn’t even be sure what you mean. What types of flowers are there? What types of trees? They need to experience a part before they know what they want in full. Best to give them something good soon and then tailor the rest of the hike based on how they react to what you’ve given them.”
Bob laughed. “I never really thought of it like that. I guess I can buy that. What about all that stuff you were saying about unit tests in code? Does that have a parallel in your analogy?”
“Well, I guess all analogies are somewhat imperfect, but I see things like tests in this case as the safety that you build in. You may end up crossing a particularly treacherous avalanche chute, like this one you almost slipped down. It’s better to spend a few minutes picking a really good path and marking it with a cairn. Or maybe you’re crossing a creek and you spend the time to stretch a rope across to make it a bit easier to pass through. You expect that in all likelihood you’ll be back on this path at some point, and you want to be sure that the path is safe and trustworthy. Unit tests are like that in a way. It’s not really wasted time. You end up safer the first time around, and you have more confidence when you return.”
Bob sighed. “So you think the customer will be happy?”
Fred picked himself up off the edge of the dry, dusty basalt block and stretched. “I think so, I really do.” A crepuscular pika poked his head out of a hole about 20 feet away for a few seconds, saw the men, and decided to call it a night. The sun was nearly down now and the two men had a bit of a trek back to their base camp in the valley below. “You never know though, some customers are never happy. Some people will never be content. Ultimately, you can’t please everyone.”
Bob scrambled up, brushing dirt and shards of rock from his pants. “I guess not,” he said.
Fred turned and started walking back down the rocky path that they had come up. He paused as they passed the first switchback on their trip down. “You know, Bob, even though hikes aren’t always as much fun as I thought they might be, and even though you end up with blisters and scratches and maybe the mosquitoes are biting, I still enjoy them.”
Bob just smiled.
Part of Carl Jung's contribution to the world of psychology is his concept of "archetypes". From Wikipedia:
In Jungian psychology, archetypes are highly developed elements of the collective unconscious. Being unconscious, the existence of archetypes can only be deduced indirectly by examining behavior, images, art, myths, religions, or dreams. Carl Jung understood archetypes as universal, archaic patterns and images that derive from the collective unconscious and are the psychic counterpart of instinct. They are inherited potentials which are actualized when they enter consciousness as images or manifest in behavior on interaction with the outside world. They are autonomous and hidden forms which are transformed once they enter consciousness and are given particular expression by individuals and their cultures.
Strictly speaking, Jungian archetypes refer to unclear underlying forms or the archetypes-as-such from which emerge images and motifs such as the mother, the child, the trickster, and the flood among others. It is history, culture and personal context that shape these manifest representations thereby giving them their specific content. These images and motifs are more precisely called archetypal images.
I read an interesting article a while back that talked about "personal brands" from this Jungian archetypal perspective.
It's a very fascinating concept. These sorts of constructs are of course nothing more than categorizing or organizing observations into containers from which we generalize. However, I think it's interesting to observe how similar some of the archetypes in the linked article truly are.
I'm finding myself a little disillusioned by how completely and totally consumers have been trained to value style over functionality. The trend has especially hit home for me in the technical world. A demo using the latest glitzy technology automatically wins simply because of the special effects. When the development team admits that the data is all fake and the interfaces are non-existent, the user ignores this huge gap in development simply because it looks cool. Requirements documents are considered better if they're full of visual graphs, charts, arrows, and flows rather than any real meat.
It's a reflection of our culture I suppose. We're interested in all things new, all things beautiful, all things that tickle our emotions, our senses, and our desires. 3D TV is now the rage (or maybe it's fading now...?) -- imagine, seeing movies that are a little bit more like real life. Our culture is one of vicarious, disconnected participation. Why climb a mountain when you can see an IMAX movie that shows a pro doing it?
There's also such an amazing degree of shallowness... Our conversation becomes riddled with catch-phrases, idioms, and cliches until we're hardly more than advanced, random, modern culture speech generators. We're a culture that consumes and consumes with a focus on functionality that will let us consume more -- more efficiently, and faster. We no longer climb a hill to enjoy the beauty and reflect on the Creator but to post the pictures on Flickr in order to fill out our Facebook profile with more pictures. We still create, but we create to consume again.
Is our culture purposeless beyond the next consumption high? Do we strive to make the world a better place anymore for any reason other than fulfilling our own dreams and aspirations? Where are those who will sacrifice their own good, their family's good, for a Higher Cause? Are we really as narcissistic as we seem?
We have tools available to us in life. We can develop those tools -- sharpen and refine them. Many find satisfaction in their jobs, not necessarily because of what we're accomplishing, but because it makes us feel good. We fight wars, not because there is a wrong to Right, but because rising oil prices will impinge on our nation's ability to consume. We give our money to aid foreign countries because in the long run, it could help us. We abort humans before they see the light of day because it's messing with our plans for parenthood. We give to the church because, in our pride, we want our sect to prosper and show the world they're wrong.
So what does all this have to do with style vs. functionality?
A requirements document for any project defines the objective. It's the purpose of the project. What does it do, how fast does it need to be, what sort of interfaces does it need to support? It's solving a problem -- answering a need. Our culture looks at problems and finds ways, not of solving them, but of making them less painful and more appealing through some spiffy styling. We let the requirement for speed slip because it really would be nicer if there were a more graphical display for the user. Yeah, it will slow things down, but imagine how much more fun it will be to use! At that moment, the objective is being redefined. Our own pleasure, longings, desires, and convenience are now the focus.
For some projects, perhaps this was an objective in the first place. An iPhone's driving purpose is not to fill a previously unmet need but to make accomplishing a wide range of tasks more enjoyable and easier. In this case, the initial functionality was focused intentionally at ourselves.
The obvious question is immediately raised: What on earth am I suggesting functionality should be focused on if not improving ourselves? So what if we create to consume again?
I think the answer has to do with precisely one thing: How does one define "good"? Is something good if it makes us happy? What if it makes us happy but it makes someone else equivalently unhappy? Are "good" things, things that make everyone happy? What if happiness incompatibilities exist? Imagine there are only two people left in the world. The only thing that will make either person happy is to kill the other person. Are they both "bad" even if they both want exactly the same thing?
I think a more plausible explanation is that "good" is what is Right. Not just "right" for you or "right" for the majority, but truly Right. An absolute Right implies the existence of some sort of ultimate Requirement Document. We can take action to bring our life into line with the spec, or we can attempt to creep the Requirements here and there to make our life more comfortable.
The following is a not terribly organized set of ramblings that I had regarding augmented reality.
Just for the sake of defining what I'm talking about, Wikipedia refers to augmented reality as:
Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. It is related to a more general concept called mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. As a result, the technology functions by enhancing one’s current perception of reality. By contrast, virtual reality replaces the real world with a simulated one. Augmentation is conventionally in real-time and in semantic context with environmental elements, such as sports scores on TV during a match. With the help of advanced AR technology (e.g. adding computer vision and object recognition) the information about the surrounding real world of the user becomes interactive and digitally manipulable. Artificial information about the environment and its objects can be overlaid on the real world.
There are different types of augmented reality; for example, the interface used in the movie Minority Report is a sort of augmented reality interface. More recently, Google Glass was in the news quite a bit with its interesting wearable augmented reality device. Google's product (and, in general, all "wearable" augmented reality devices) is more like what I'm thinking about in terms of this article today.
The odd thing is that although I like the idea of wearable "augmented reality" in many ways, it needs to be discreet enough that basically no one but me will know if I'm running it. Part of the issue with any of this tech is that it makes humans seem less human. People aren't comfortable talking to someone who has a visible camera aperture pointed at them. In reality, I think most people are aware that pervasive video surveillance cameras and other recording devices are already recording us on a regular basis (an almost continuous basis, depending on where we are), so it's not a whole lot different. There's something personal about it, though.
Let's say I'm talking to a friend about how frustrated I am with another person. Maybe my boss, or my teacher, or a friend who I feel has wronged me or others. In these situations, we currently feel comfortable because we are in a private area and the discussion can't be recorded without it being obvious. As we move forward into the future, this is going to be less of a sure thing. We will have to trust people either not to record when we ask or to keep their recordings private. This is new for us in the realm of private, face-to-face conversations, but it's not new elsewhere. I forward emails that complain or whine about other people, and I assume that my friends will not forward them on. (Please note: I'm not justifying my whining about third parties with other people. This is probably a bad habit that I should break. Regardless, I still do it sometimes.)
The example I gave above is more about gossip than anything else, but the same could be said for pillow talk (or "revenge porn", which is becoming a thing) or even things as mundane as business decisions in a company. In the near future, recording devices and other computing resources will be small enough to be nearly undetectable. There has to be a cultural and technological etiquette established to deal with this properly. What I mean is that in some ways this is about being polite and civil, as well as trusting and being trusted, and at the same time it will likely mean the development of tech to disable, or at least detect, the presence of devices like this in situations where we don't necessarily trust. We already have work areas that we can't bring certain devices into. This works when it's relatively obvious if you're in violation. But I think we'll see new tech that allows restrictions in an area like our workplace to be ENFORCED.
If I were more of a hardware guy, I'd be looking at a startup to do DETECTION tech for new hardware like this. Let companies like Google and Apple develop our new high-tech augmented reality devices (I can just see Apple marketing it as the iBall!). We're going to need a way for companies and people alike to feel comfortable around it. It's boring technology. Most people would be intrigued by an invisible augmented reality device because it adds value to their life (or they believe that it will). But a device that detects this same technology is more of a necessary purchase to protect yourself than anything else.
There are downsides to creating devices that are intended to identify or disable recording. For example, police or others who are actively abusing their power or authority do not want to have their deeds or words recorded, despite the fact that the public should be keeping them accountable. But I still think that there's some good money to be made in this market, and I'm interested to see how it develops.
This is not meant as an in-depth review, but really more of a comment on the book "The Current Justification Controversy" by O. Palmer Robertson, edited by John Robbins and published by the Trinity Foundation. The book is somewhat technical in nature and walks through the proceedings and circumstances of the discussion and events that surrounded Norman Shepherd's controversial statements as he taught at Westminster Seminary during the late 70s and early 80s.
I picked up this book mainly because of the recent issues with Peter Leithart and the goings-on in the PCA and particularly within the Pacific Northwest Presbytery. I currently attend a church that's a part of this Presbytery and heard a lot about the whole situation during the Leithart proceedings. There are obviously many parallels between these two cases.
What interests me most is that it seems like nothing is new. I don't mean to gloss over the differences between the two men, but I do find it very interesting how the two situations had similar aspects:
- The "experimental" style of re-describing or re-emphasizing traditional orthodox statements regarding core theological points
- The mix of responses to this restatement which included both those that were vehemently opposed and those that thought that the approach was beneficial
- The particular issue of whether the viewpoints (whether right or wrong from a Biblical standpoint) were in accordance with the Westminster Confession of Faith
- The confusion over terminology and language which constantly seemed to mire discussion
I am no theologian. Nor am I that deeply involved in following either the Shepherd case or the Leithart case, but I do find the controversy of both to be discouraging for a number of reasons.
First, I realize that denominations such as the OPC and the PCA do hold the Westminster Confession of Faith (WCF) as their standard. I think it's good to have standards and I think it's helpful to reference and cite them. However, I find it very frustrating when a theological discussion is railroaded because of issues between a theological stance and a standard such as the WCF. The standard should merely be a reflection of revealed, Biblical truth. And now, for the part where I may lose some of my readers, I believe it is in error in at least some parts! If I were to become an officer in the PCA, I would need to indicate on which points I take exception. These aren't huge differences, but they are important. My first main gripe with both the OPC and PCA is that some in the denomination appear to believe that holding any exceptions to the standards makes a pastor, or ruling elder, or other officer less of an officer. It does not matter if that officer can show strong Biblical support for their position.
I'm conservative by nature and I do tend to think that changes ought to be very carefully considered before being adopted. But I find it simply presumptuous to assume that the Westminster Divines got it 100% right. Personally, I would hope that over time we as a denomination would have the wisdom to refine our standards. In particular, the standards tend to be reflective of the situation at the time of the writing of the WCF. This makes sense -- the errors and heresies of the time were first and foremost in the minds of those who wrote the WCF. As new doctrines are developed, we should be willing to refine our standards to deal with those that are heretical and to embrace those that are both true and practical. I say practical here because I do not believe that all Biblical truth needs to be summarized in a standard such as the WCF in a perfectly exhaustive way. Rather, I think that the standards should illuminate the important parts and provide clarity to the interpretive framework by which Biblical truths are being explained.
Like it or not, the Bible tends to explain complex theological concepts using a variety of expression and terminology. It can be confusing to examine concepts like the Trinity, or free will vs. sovereignty, or a host of other similar concepts. When we write a standard, we tend to codify the expression of the complex Biblical passages into single statements. This is the purpose of a standard, no doubt, but I believe that it often makes for an incomplete or even worse, an inaccurate statement. When this happens, we ought to be able to back away from the standard enough to analyze whether the standard is in fact accurate instead of simply clinging to the standard. If the standard is deficient, let's update the standard. If the standard is not clear, let's make it clear.
Why is it that modern Protestant, Reformed churches can't update standards?
Secondly, the petty and argumentative way that discussions ensue (on both sides) is very sad. Theologians are human too; instead of coming to blows over a football rivalry as their lay members might do, it honestly seems like certain contemporary opponents on these issues would gladly go at it in a round of fisticuffs (if not for the fact that they tend to isolate their vitriol and pronouncements on the Internet). Since it doesn't seem practical to create an online "Last Theologian Standing" virtual boxing game that could result in some sort of catharsis, it seems like there ought to be accountability and even discipline within the ranks of the church for the sort of uncharitable outbursts that both sides engage in regularly. I don't mean strongly stated theological opinions or even debates. I'm referring to the potshots, the insinuation, the ad hominem attacks, the public vilification that all seems to take place. It's wrong and it ought to stop.
Third, it seems as if both sides ought to be more careful with their definitions. Words are important. The Bible sometimes uses the same words to mean different things. We all do this. We ought to be careful to define what we mean. In order to have a clear discussion, it is important to realize that we are not all coming to the table with the same actual understanding of what each word means. When we start to disagree about theology, I think that often we are simply confused over what the other person is defining a word to mean. If we use the WCF (or another standard) as the only allowable dictionary for the expression of Biblical truth, I think we limit ourselves. To give an example of what I mean, I'd like you to consider what Paul, Peter, and James would have given as a definition of justification if we were somehow able to interview them. Would they respond with a canned line? Would they even necessarily repeat exactly what they had written in their letters? This is pure conjecture obviously, but I believe that not even the Apostles would give a perfectly unified response. I don't mean that some would be right and others would be wrong, but rather that each would point out a facet of justification that may not reveal all aspects or all truth in a single statement. Perhaps I'm wrong, but given the letters in the New Testament and the actual theology that we do have taught in those letters, I don't think so.
Terrorism is tricky to define. One definition is:
the use of violence and intimidation in the pursuit of political aims.
This seems far too simplistic. Isn't every war about the pursuit of political aims? Don't wars employ violence and intimidation?
the use of violent acts to frighten the people in an area as a way of trying to achieve a political goal
This hardly seems any better. Again, wouldn't virtually all wars and military engagements in the modern world be considered terrorism by this definition? Granted, not all military engagements are broadly staged. Sometimes the mission is to apprehend or even assassinate an individual. In this case, the scope is small, and although we wouldn't explicitly achieve this goal through terrorism, the act of apprehending an individual (and especially assassinating one) is designed to strike fear into the hearts of those who challenge us.
There is a difference between "seeking justice" and "striking fear into their hearts" I suppose. Ethically, the concept of "eye for an eye" is about justice, not fear. There is a natural inclination in our hearts to provide justice to those who are wronged. But justice implies that we have authority to act. So, for example, few people would say that it's terrorism for police to gun down an armed man who is actively attempting to shoot other people. Yes, it cements the power of the state, but it's also a just act by the state. At this point, I believe it's more about motives. Someone who is carrying out the execution of justice can do so in a very negative way -- a way that is designed to make people fear them.
Here's another definition:
The unlawful use or threatened use of force or violence by a person or an organized group against people or property with the intention of intimidating or coercing societies or governments, often for ideological or political reasons.
Unlawful is a horrible word here. Unlawful under what law? Was the American Revolution the act of terrorists? Perhaps... The word becomes meaningless in conflicts where the word is applied. Coercing is a good word though, I think. The idea here is that the act of violence is intended to have a side effect that brings terror to others. So, for example, arresting a suspected evil-doer is not terrorism, but holding a known evil-doer's family hostage would be terrorism. Because of the horror of war, there are always small acts of terrorism committed by this definition. And sometimes not just small acts. The bombs dropped on Japan, for example, were specifically designed to coerce the Japanese into surrender. One could argue that this was "necessary" to end the war, but even so, it does not change the action.
The problem I have overall with the word is that it is a word employed to demonize one's opponents. If a person says that they are a soldier in the struggle against tyranny, the tyrant can simply call them a terrorist. It's a word used to isolate and delegitimize a violent action. It's always the "other guys" that are terrorists. It implies a certain amount of cowardice.
I'm not advocating violent action against any government. I even have issues with the American Revolution (I might have even been a Loyalist, or at best, not a very enthusiastic supporter). I'm certainly not trying to say that those who advocate violent actions are right and just, but rather that perhaps our own military is not right and just in how they act. Radical Islam justifies the killing of innocents through the concept of jihad. Americans call this "collateral damage".
I'm sure there are many within the government who agree with me. It's a constant problem for any power engaged in warfare: At what point do we stop acting as the instrument of justice and become an instrument of terror?
The recent actions of ISIS in the Middle East -- beheading, burning, and in other horrific ways slaughtering those that they capture (in most cases non-combatants) -- are a perfect example of the sort of horror that I believe truly should be defined as terrorism. This is pure evil and it doesn't take much to convince anyone of this. However, our reaction to such horror should not be a sort of Western jihad against these "infidels" -- it should be a focus on bringing them to justice. It places a much harder burden on us. In some ways, it seems impossible. But I believe it's the right thing to avoid falling into the same endless cycle of back-and-forth bloodshed that terrorism on BOTH sides tends to instill.
What do you think? How can we properly respond to terrorism in the world without simply adding to it?
I've talked about this in a slightly different form before, but I'm a firm believer in the value of rest. There are obviously many scientific studies that show the effect of rest on both physical and mental well-being with everything from the rebuilding and repair of muscle tissue to the development of synapses in the brain. It's an obvious need that we all have as humans.
However, although most humans get some sort of routine of sleep at night and take some amount of rest from normal work activities, I'd like to suggest that you challenge yourself on how you rest. My hypothesis is that many people can gain a significant amount of improvement specifically in their mental and creative capacity through a more intentional application of rest.
Our world is busy and hectic. I'm sure that humans have said this since the dawn of time. When the first nomadic tribes started using horses instead of just walking, I'm sure there was some naysayer in desert garb who said "Our lives are so BUSY now -- we have to take care of horses now in addition to everything else." I don't mean to suggest that we're unique in having busy and hectic lives. I think that many, if not most cultures have felt similarly in the past. However, I'm suggesting that we attempt to enjoy our life by avoiding both the temptation to muscle our way through it and the temptation to run away from it. It doesn't matter what it is that makes us busy or what it is that makes our days feel hectic; we need to set apart time to retreat from this without completely giving up.
I am a Christian and I do believe that prayer is a powerful and meaningful "retreat" that is, in itself, a form of rest. Christian prayer is the idea of direct communication with a loving God who we claim as our father. However, for Christians or non-Christians alike, I believe that there is an additional rest or retreat that is valuable.
First off, I'm guessing that most readers at this point will make the assumption that I will now talk about "shutting off the cell phone" or "logging off of Facebook" or "turning off the TV". These are all commonly called-for activities -- the idea that somehow retreating from technology will solve our problems. What I'm suggesting isn't necessarily in conflict with that, but I believe it's far more purposeful. Technology does not make our lives meaningless or boring or depressing or hectic. However, any technology that we use day-in and day-out is something that we should take a break from periodically. If you do not use a computer except on the weekends, I think the idea that you should "take a break" from using it is a little silly.
The purpose of rest and retreat is to allow for a certain sort of healing and recovery. Just as with muscles, we do need an opportunity to not use technology if we expect to be better in using it in the future. We need to identify areas where we can be "mentally muscle-bound". These are areas where we have developed skills or abilities that are perhaps not well balanced or contrasted with other skills that are being ignored.
For whatever reason, our brains have the strange ability to compensate for senses that are unavailable. Although most people will initially stumble and have trouble operating completely in the dark (as if they are totally blind), over time, the other senses can be honed to fill in a large part of the gap that eyesight previously provided. It seems natural to me that the same applies in other skills and abilities. We should ensure that we are not reliant on one "sense" so much that we are helpless when it becomes unavailable. By "sense" here, I mean some piece of technology, some ability, some tool, some technique. If the only way we typically communicate is through writing an email, we should consider taking a break from email in order to hone our ability to communicate in other mediums. If we are used to reading relatively short articles or blurbs of information (such as blog posts and mainstream media articles) we should take a break to focus on our ability to dig deep into lengthy and complex works.
Self-improvement involves constant analysis of where we are. Be aware of what you do often and what you do infrequently, and when possible, take breaks that allow you to hone those skills or those areas where you feel you've fallen out of practice or familiarity.
And when you need it, never think twice about taking a nap. We all need rest.
Since just a few days after the New Year, I've been on a new diet. Thus far, it's been a success in the sort of way that I think all good diets should be:
- I have been losing weight at a fairly consistent rate (about 2lbs per week on average)
- My energy levels are the same or higher than before the diet started
- I do not constantly crave or fantasize about food
- I can feel and notice an increase in my stamina as I exercise
Some diets I've tried in the past have resulted in rapid weight loss (maybe 4-5lbs a week) but I was unable to stay on them consistently. It felt like a type of torture and every day was horrible. I felt tired and weak, I constantly thought about food, and I just struggled every day. This was especially the case for the Atkins diet. In one case (many years back), I was able to lose a substantial amount of weight over about 3 months, but the weight came back pretty quickly within a year.
So what is my diet? It doesn't exactly have a name. It incorporates elements of the Atkins diet, some of the grain-free stuff, and intermittent fasting -- basically, whatever I've found works for me. Here are the basic rules:
- Don't eat before 10am.
- Don't eat after 2pm.
- Minimize processed foods and sugars
- Minimize wheat (and grains overall)
- Mostly focus on protein -- no need to avoid fat from meat sources
- Eat slowly and stop when I feel full
- Eat meals, not snacks during this time (two small meals at say 10am and 1:45pm are fine, but avoid grazing)
- Lots of water
- Nothing with artificial sweeteners
- Coffee, tea, other non-caloric drinks are fine outside of the "eating hours"
This is the basic set of rules. I do make a few adjustments to this core set of rules. The adjustments are really to cover social situations where diets make social interactions frustrating or awkward. Eating with family and friends is a wonderful and enjoyable social experience, and I don't think that a diet should force that part of our life to be radically altered.
- No more than once a week, I can shift the times from 10am-2pm to 5pm-7pm provided I go to bed after 11pm.
- I have diet break days (where really anything goes) once a month (on average). This allows me to participate in things like holiday celebrations and birthdays with the family.
- Once a week, I allow myself a small amount of alcohol (whiskey, wine, cider -- I try to stay away from beer) in the evening provided I go to bed after 11pm.
The social "adaptations" to this diet are not required, they're simply there to allow me to feel a little more human with other people. It's not as if I have to shift my schedule, or have a drink with friends, or even have a break day. In fact, I've found that on this diet, that I usually really restrain myself in these situations. It gives me the flexibility to not be the one guy who never has a glass of wine. But if I'm really not feeling like some, I can (and sometimes do) skip.
I've lost about 22lbs so far.
I previously had a number of issues related to diet:
- Constant or increasing weight
- Back pain almost every morning
- Trouble sleeping at night (sometimes)
- Lethargy in the afternoon/evenings
- Symptoms like those of IBS
These are basically gone.
The one change in my habits is that I sometimes do take an afternoon nap. I sometimes do feel a little tired about an hour or two after eating my mid-day meal. Also, mentally, I feel that it's quite helpful to nap for 30 minutes to an hour each day.
This diet works for me. I am not saying that this diet will work for you. I believe that one of the main problems with diets in general is that human bodies are very different and it's difficult to find something that works particularly well for just YOU.
A quick note about the statement about grains. I'm not a "gluten-free" person, nor do I think all people ought to avoid grains. However, from my own anecdotal evidence, it seems I do better without grains in general. Rice is the best for me -- I can handle that pretty well. Quinoa also seems OK. I mostly avoid corn, but I have it in small amounts sometimes. Wheat seems to cause the most trouble for me. But oddly, not in all forms. For example, tortillas seem fine in most cases. In general, it seems like the puffier the bread, the worse it is for me. Those sourdough loaves or the sandwich bread at Subway are the worst. It may be that I'm actually sensitive to a preservative or something related to yeast, but I'm not really sure.
I think it's a good idea to spend time learning about what works well for your body in particular -- not just reading books, but experimenting on yourself. What can you handle and what can't you handle? How do different types of foods make you feel? Reading books and doing research is good, but ultimately, it's about what works for you physically (so you can feel good) and what works mentally (so you can stay on your diet).
Self-confidence can be a great thing, so long as you are truly capable of those things that you are confident about.
Feeling good about your own abilities can be very powerful. And correspondingly, lack of self-esteem can be a real problem. We ought to be confident with others when we're dealing with things that we are experts on. If we are a successful business-person, we should feel confident about talking about how our business runs and how we got it to where it is.
Feeling as if our own opinion is intrinsically worth less than others is definitely an issue and I'm sure many people struggle with it. But I think that overall our culture struggles much more with misplaced self-confidence than with a self-esteem problem.
Children are often taught that their work is excellent, even when it's really not when compared to their peers. They are taught that they are incredibly smart, when they really aren't.
The word "excellent" should really mean something different than "good". If I have a good employee, I mean that he performs his duties well and does a good job. This does not make him excellent. Excellent implies that he truly stands out among employees as a particularly exceptional character.
In my company, as in many companies, we have annual performance reviews. During these reviews, our manager assigns a number between 1 and 5 that indicates our performance for the year. I had one manager who consistently gave 4 and 5 ratings to all his staff. His explanation was that "we only hire the best and the brightest, and you all are exceptional employees". This really isn't the point of the evaluation. The point is to show which employee or employees from the larger set of all employees truly are exceptional. Most employees should probably be given a 3 rating -- the description next to this number indicates that an employee with a 3 rating "consistently performs his duties as assigned and meets deadlines" (or something to that effect). In reality, that describes most good workers. People that you want to stay with the company forever if possible and that are a delight to work with and spend time with.
To make everyone special is to make no one special. The point of lifting some people up as experts in their field, or people deserving of honor and reward, is because they truly outshine those around them.
But to take this back into the realm of self-confidence, I think that far too many people in the modern world believe that they have amazing "skills" when really their skill is mediocre at best. Consider those who have written some code in their life. Programming has become a very common auxiliary task for many professions. People who work with computers in various fields find it useful to spend some time writing code of some sort to automate, or simplify, or reduce complexity in their ordinary jobs. This is great for them, but they are not experts in most cases.
I'm here to burst some bubbles. Spending a few hours, or a few days, or even a few weeks learning a new skill does not make you an expert. Furthermore, going to a university for an undergraduate degree, or even a graduate degree, does not make you an expert. Even spending years of your life exercising a skill does not make you an expert.
Let me explain what I mean by an analogy: Playing the piano is a lot of fun. Many people enjoy it and many people are even good at it. There are very few experts. I cannot simply become an expert because I practice each day or because I go to an expensive school or because I buy expensive pianos.
Becoming an expert at something is a combination of the effort that you put into a skill and the inherent talent that you possess for that skill.
Keep in mind, it's not enough to just spend time with something. Some people will never excel at something even if they are passionate about it. I've read books by authors who have written for over 40 years whose books are still drivel. I've seen construction work done by people with years of experience in a variety of contracting work that's still awful. I've seen software that's written by veterans of coding who had to punch out their code on ancient computer systems that's simply terrible.
I'd encourage the following:
- Don't expect people to respect you and your work because you were educated in a particular field. Having an MBA does not mean you know how to run a business.
- Don't think that because you work in an industry, you are therefore an expert in that industry. There are far too many people in jobs that they are lucky to have, and really aren't qualified for.
- Critique the praise you receive. Are they simply flattering you because you can do something for them? Are they ignorant of your actual skills? The fact that many people praise your abilities does not mean that you are exceptionally good at what you do.
But, you might ask, what's the point of being all negative about your own abilities? Am I just being a killjoy that encourages morbid introspection and wants you to run yourself down professionally?
I think the best answer is that you cannot become an expert at something when you already believe you are one. We learn best from our failures. The smartest people I know are people who are painfully aware of many areas where they lack knowledge and ability. They actively pursue these areas and attempt to fill gaps where they know they are less than they can be. They become experts in part because of their own innate talent, but also because of their dogged determination.
So be humble. Never assume you are the smartest person in the room. Never talk down to others when they share an idea. Be ready to learn from anyone. Sure, there will be idiots that you interact with in life and people that rapidly show themselves to be ignorant of areas that you are very knowledgeable about. You certainly are smarter than some people. But if you start with the idea that you are very smart, very experienced, and very wise, you're very likely to get no smarter, no more experienced, and no wiser through your interactions with others.
Also, it's just annoying.