Updated Review of LLM Based Development

I tried developing with GPT in mid-2022. While I was amazed by the potential, I was not impressed enough to add it to my daily development workflow. It fell off my radar as a development tool, outpaced by far more impactful uses in text generation and image creation, a toolset that has significantly changed my day-to-day productivity.

Recently, a group of peers convinced me to give coding with an LLM another shot. They loved using A.I. to develop code in languages they were not comfortable with, or, as managers, to better explain what they wanted to see in a project from their team. What convinced me to try it again was their highlighting of how well the results were formatted, syntactically correct, and well documented from the get-go. While they admitted the development of code may not be faster, the prospect of all those benefits culminating in a cleaner, well-formatted final product convinced me to develop with GPT again in earnest.

I began my reexamination of the tooling via demos, as we often do. I was very impressed. I converted code into PowerShell (which I don’t know well) and re-created functionality I had come across in the weeks prior. I was so impressed, I showed my team examples of how the work they had completed in the past could’ve been done with the help of GPT instead.

After those successes, I committed to using GPT to develop. Over the next few weeks I made sure to use it on projects I was working on.

While the technology showed incredible advancements since I tried it last year, it still hasn’t become my go-to in the same way using ChatGPT has for writing content.

Here are some areas where I was impressed but also left wanting more:

  1. Code completion
    • Pro: Impressive. The look-ahead came in handy similarly to code-completion functionality of the past, with the added benefit of more contextual relevance that was not just a “cookie cutter” snippet.
    • Con: It gave me useless hints quite a bit, and I found myself typing almost as much as before with the incumbent “dumb” completion. I think it is because my mind is moving ahead to what I want the code to do, not necessarily what it is doing on the screen at the moment. In the end, it is using patterns to make predictions. So, any new code resulting from changes to my approach, or from on-the-fly reworking to fix a bug (one not due to syntax issues), took as much time to develop as it would have with non-GPT-based code completion.
  2. Testing
    • Pro: When it comes to testing an existing application, the A.I. hits it out of the park. Ask it to “write a test for myFunction() using Jest” and it creates an awesome base test case that I would have hated to write for each function.
    • Con: Some of the same issues outlined in the “Code Completion” and “Functional Development” sections can be problematic here. It doesn’t always create a great test for code I haven’t written yet (i.e., TDD). However, if the code is already there, it uses the context I’ve provided to unpack what the function is supposed to do and generates all the mocks and assertions needed for a well-written unit test.
  3. Functional Development
    • Pro: Much like helping me get past the dreaded blank page in text generation, I found it more useful than Google searches and StackOverflow reviews for developing a series of functions I wanted, without developing entirely from scratch. Better than code snippets, the snippets the A.I. gave were pre-filled based on my prompts, variables, and existing object definitions. That was appreciated. I didn’t have to review the documentation to tease out the result I wanted. The A.I. pulled it all together for me.
      Additionally, the fact that it almost always has an answer goes underappreciated in other reviews I’ve read. Part of what makes it so advanced is that it fills in a lot of grey area even if I (as a stupid human) carelessly leave out an instruction that is critical to generating a solution. If I were to get the response, “could not understand your request,” due to my laziness, I would never use it. The assumptions it makes are close enough to my intent that I either use the solution, learn a new path, or see what my explanation is missing so I can improve how I communicate with it.
    • Con: The end result did not work out of the gate most of the time. Sometimes it never got it correct and I had to Google the documentation to figure out the issue. I suspect this was because documentation exists for multiple versions of the library I was using, but I’m not sure. While the syntax was correct, the parameters it assumed I needed, or the way the calls were made to interface with a library/API, led to errors.
  4. Debugging
    • Pro: Per the “functional development” points above, I was impressed at how I could respond to a prompt result with “I got this error when using the code above: [error]”. It recognized where it went wrong, and attempted to rewrite the code based on that feedback.
    • Con: Each response had a different result than the original. So, instead of fixing only what I found was wrong (like a missing param), it also added or removed other features from the code that were correct. This made the generated result difficult to utilize. In some cases, it could never understand the issue well enough to generate working code.
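To make the testing point above concrete, here is a sketch of the kind of Jest unit test GPT would generate for an existing function. `formatUser` and its behavior are hypothetical stand-ins, not code from my projects, and minimal stand-ins for Jest’s `describe`/`test`/`expect` globals are included so the sketch runs on its own under plain Node (remove them when running under Jest):

```javascript
// Hypothetical function under test -- an example, not code from this post.
function formatUser(user) {
  if (!user || !user.name) throw new Error('user.name is required');
  return `${user.name} <${user.email ?? 'no-email'}>`;
}

// --- minimal stand-ins for Jest globals, for self-containment only ---
const describe = (name, fn) => fn();
const test = (name, fn) => fn();
const expect = (actual) => ({
  toBe: (expected) => {
    if (actual !== expected) throw new Error(`${actual} !== ${expected}`);
  },
  toThrow: () => {
    let threw = false;
    try { actual(); } catch { threw = true; }
    if (!threw) throw new Error('expected function to throw');
  },
});
// ---------------------------------------------------------------------

// The shape of the generated test: one case per behavior it inferred
// from the function body, including the error path.
describe('formatUser', () => {
  test('formats a user with name and email', () => {
    expect(formatUser({ name: 'Ada', email: 'ada@example.com' })).toBe('Ada <ada@example.com>');
  });
  test('falls back when email is missing', () => {
    expect(formatUser({ name: 'Ada' })).toBe('Ada <no-email>');
  });
  test('throws when name is missing', () => {
    expect(() => formatUser({})).toThrow();
  });
});
```

Writing this boilerplate by hand for every function is the chore the A.I. removes; given the function body as context, it infers the happy path, the fallback, and the error case on its own.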

One limitation I am not too surprised about, and am hopeful to see evolve in the future, is the A.I.’s understanding of a project in its entirety, in a way that lets it use that context in its function creation, making the solutions it provides “full stack”. Imagine a Serverless.com config, for an AWS deployment, that generates files and code that create and deploy workflows using Lambda, DynamoDB, S3 and so on, all developed from prompts. With the yearly (and more recently, weekly) leaps, I don’t think we are too far away.

As of today, I find myself going to GPT when filling in starter templates for a new project. I find it’s a much better starting point than cookie-cutter functions as I set up my core, early, “re-inventing the wheel”-type skeleton.

For example, I will use a GitLab template for my infrastructure (be it GitLab Pages, Serverless, React, Node.js, or Python, and on and on), then fill in the starter code and tests via a few GPT prompts, copying them over. Beyond that copy, I find myself detaching from GPT for the most part, returning occasionally to “rubber duck” new framework functions.

Examples referenced above

Here I asked for a non-3rd-party use of promises (only await/async) which worked. Then I asked to modify the code by adding a zip task, and it re-introduced the promisify utility when it added the zip process.

Steps in overcoming the great blank page. Write more, write well, write often.

I’ve experimented with a variety of methods to provoke myself to write more, and with any luck, write better. My latest writing routine is a combination of my Apple iPad Pro and Apple Pencil, text-to-speech tools, and A.I. content generation. Through this latest addition I am writing more, writing more confidently, and feeling freer than ever to set loose the thoughts swirling in my mind.

Step 1: I capture thoughts and ideas in stream-of-consciousness in my Nebo app. If a new thought pops up mid dump, I scroll down the page and jot the branching thought down, then return to my previous cursor and carry on.

Step 2: After I’ve exhausted my train of thought, I review my hand-written notes, making corrections via the A.I. handwriting interpreter built into Nebo and expanding upon incomplete or unclear ideas. I avoid self-criticism about what I am saying. Much like throwing ingredients into a crockpot, I am not concerned with specific measurements.

Step 3: Next, I head to a product that leverages GPT, like Konch.ai or OpenAI’s ChatGPT. I trim, paste, move, and add to the A.I. responses. In some cases I ask GPT to merge the notes into a single post. Other times I am happy with the foundation and move into editing.

I find that GPT is much like an always-available writing coach, always willing to review what I have created and give me back something with more polish or a suggestion on a new arrangement. I can disagree with it, “the third paragraph misses the point of how X affected Y in the story.” Or, delete sections and start over, “remove the intro”.

For whatever reason, it is easier for me to try and write something that “prompts” the A.I. to understand me and what I am trying to say, as opposed to writing to myself through a blank page and knowing that the feedback won’t come until after I post the entry. Much like with coding in an IDE, or learning languages via Duolingo, I get a tremendous amount of motivation and satisfaction from instant feedback.

Step 4: Now comes the fine tuning. At this point I have separated from A.I. and I’m excited and engaged with my story. Sort of how one gets going when they are surrounded by a group of friends interested in the same subject and a couple beers deep.

At this point the heavy lifting of the slabs of marble into my studio is done, the bulk of the shape is formed, and I am left with my small chisels to smooth out the edges and bring the work to life.

Step 5: I have heard that reading one’s own writing aloud improves it. I know that’s right, but reading my own post aloud over and over is uncomfortable and exhausting. It is like being a bit shy of myself. But the advancements in text-to-speech have come a long way. Tools like AWS Polly, or a free, web-based tool called TTSReader, convert my text into a voice that is more pleasant to hear than my own. I copy and paste my text, sit back and listen, and correct any mistakes I’ve missed up until now. The fact that the A.I. reads it exactly as it is written and punctuated makes mistakes sound like a scratch on a chalkboard.

Step 6: To complete the post, I either find an image online or use Konch.ai to leverage A.I.-based image generation via amazing technological breakthroughs like DALL-E and Stable Diffusion.

The key to advancing my writing proficiency is overcoming the roadblocks that keep me from the work. Just as a new pair of sneakers gets me wanting to run again, tooling like this gets me excited to write, and before I know it, I can’t stop the words from flowing to my keyboard. I no longer fear the blank white page! By using A.I. to remove the things I am not naturally good at, I am able to write, post, and share my ideas much more efficiently. The end result is that I am writing more, making clearer points, and feeling more confident in my writing abilities. So, while I admit I am not respecting our literary history by slogging through the use of my feather pen and an ink blotter, I am perfectly happy skipping ahead to the fun parts to get the job done.

Don’t sell the sale

Being on either end of a sales call can be tricky. The aim is to either engage with potential customers and sell, or for a buyer, get the transparency needed and end with the best bang-for-the-buck. One of the most effective strategies adopts the “simple” art of not talking. It may sound easy, but the drive to make conversation is deeply embedded in our culture. Filling space, or finding it awkward to rest in open space, can push us further from our goal. There are a few simple ways you can remember to avoid falling into conversational land mines that work against your best interests.

Selling the Sale

The first example of this mistake is described by my group of friends as “selling the sale”. One of us will try to convince the others to take part in an activity. Say you want to convince your friends to go on a ski trip. On the call you get through the first couple of reasons you’ve prepared to convince them. Your friends unexpectedly agree. But you are so excited to present the rest of your “great reasons” that you continue on. Even after they have agreed, you continue pitching the idea. In that moment you may hear my group call you out: “Hey man, I said yes. Don’t sell the sale.” Why do my friends call this moment out? Well, once a person says “yes” they are “in”; anything else out of your mouth can only work against you. You have gone from summiting a mountain of agreement to barreling down a hill filled with land mines. In short, once an agreement is reached, don’t sell the sale and create an opportunity to lose the ground you’ve gained. Don’t forget that your goal is to convince them, not to show them how great a sales pitch you can make. In other words, don’t sell the sale, sell the product.

Silence is Powerful

Another advantage of creating space in a conversation is that humans have a bias to assume silence means “disagreement.” It means no such thing. For example, I was once on a call with a vendor. The sales rep ended their pitch and gave me their price. I said nothing. Honestly, I had no idea whether it was expensive or not. Moments later I heard “…but we can do cheaper if that’s too high.”

I have seen this uncomfortable silence change rates, contracts, and features with not so much as a whisper.


By allowing statements to sit – and breathe – you give the other person time to air out what is running through their mind, be it doubt, logic, or ethics. At the end of their thought process they may realize their asking price is too high, for example, or that their proposed agreement is too strict, their conscience having felt icky once the words left their mouth. This approach can sound like a silly game, but it is not. It simply allows non-verbal communication to fast-forward past any snake-oil quips or rehearsed phrases. It allows the other party to turn their asks into a discussion. Best of all, it requires very little added effort from you.


Finally, while I haven’t researched it, I have found in practice that giving a person space to speak creates a sense of comfort. They remember the experience as having gone well.

Whether it be by building rapport, establishing a connection, or creating a sense of trust and collaboration, you can make your goals on a call much more achievable by practicing the art of silence. The next time you are on a call, try allowing for longer gaps of silence and see the difference it can make.

The Future of Work: We are not giving up, we are finally letting go

In our rapidly advancing technological age, it’s not uncommon to hear discussions about what jobs and tasks will be taken over by machines. I tend to look at it from a flipped perspective: What if we assume every task you deal with today is meant for machines? Humans are born burdened, unnecessarily, with the repetitive and labor-intensive processes of work. Our ancestors could not advance without physical labor. This is a temporary state we deal with until we figure out the best way to, inevitably, hand these tasks off to machines. From the beginning of human history, we have always been simply the “in-between”.

Reframing our problems and ideas allows us to remove walls that are set only by tradition or cultural perspectives. Once we find ways to break free from those binds, we can more easily identify ways to advance. The goal is to increase our happiness and ease of existence, not savor the burdens we are born with, or that have been passed down.

Many people are familiar with the concept of the “mechanical Turk,” where human labor is used to perform individual tasks instead of relying on a machine. However, isn’t everything a mechanical Turk? Isn’t that definition backwards? Isn’t every task not done by a machine simply an example of us imitating machinery? From making eggs to driving to work, filling out spreadsheets, targeting investments, and delivering a baby, these are all tasks that could be broken down into simpler repetitive steps. We are not losing tasks to machines, but freeing ourselves from machine-appropriate tasks so we can live as freely and unburdened as possible.

By assuming that everything is meant for machines and that humans are merely the in-between, we can more easily identify the tasks that should be handed off to machines to improve our quality of life. This shift in perspective can help us reframe problems and ideate new products and procedures that are more efficient and beneficial for humanity.

Hitler’s Shadow: The warning signs are not hyperbole

The comparison of a person to Hitler is a frequent refrain to demonstrate how intolerant, narcissistic, or brutal a person in power can be. However, this comparison can often be seen as an over-exaggeration or hyperbole due to Hitler’s infamous reputation. What people often forget is that Hitler wasn’t always the same person responsible for the systematic murder of millions. Before rising to power, he was just a small-town bigot with intense and delusional beliefs about himself and others. He worked aloud, and in secret, playing with truths and rules to finagle his way into a position where all his delusions became an infamous force upon reality.
The rise of Hitler is a cautionary tale, not just about how far a poisonous thought can spread across the world, but about the dangers of overlooking or underestimating the potential for evil in its infancy. Hitler’s rise to power was the result of a series of oversights and underestimated threats that allowed someone with his particular mindset to gain influence and ascend the ranks.

In November 1923, Hitler and his then-fringe Nazi supporters attempted to overthrow the German government in the name of nationalism and strength. The coup failed, and Hitler fled the scene, leaving his sycophants behind. Two days later, Hitler was arrested and sentenced to five years in prison. During his imprisonment, Hitler sought to create an image that would convince others of the godlike qualities he saw in himself. To do so, he created a fictional persona, writing a book under an assumed name that lauded Hitler as the savior of Germany. (All as an unbiased 3rd party, of course.) The egotist then used that momentum to stoke flames of anger, convincing the German people that he had solutions to the very problems he manifested. It was a vicious cycle: Create the problem, blame those you hate for the problems, assert you are the only one with the solution, and repeat into fervor.

Unfortunately for those alive in 1923, Hitler was only 34. This was just the beginning of his 20-year campaign in politics, where he would continue to manipulate, deceive, self-inflate, and dismantle the world. The world lacked the a priori knowledge of how such manipulative thinking could lead to a global meltdown. They did not have a “Hitler” to learn mistakes from.

The failed “Beer Hall Putsch” (coup) of 1923 was nothing compared to the atrocities that followed. It is the mindset and characteristics of this type of individual that are the warning signs: signs the surrounding community is obligated to recognize and resist. If we wait for someone to replicate Hitler’s final years as the only true comparison, then we are relegating ourselves to always acting too late.

It is the fervent blaming of others, the lack of culpability, and the “I am your savior” mentality that we must shine a light on when remembering the failure to stop the Nazi Party’s propaganda. The world has a solemn commitment to “Never Forget” the atrocities of the Holocaust and the hate-filled speech of WWII. To do so, we must not only honor the lives lost but remember how one man was able to rise to such levels of brutality. If the only time “we remember” comes after a menace has risen to power, then it is already too late.

The rise of Hitler serves as a reminder of what can happen when we turn a blind eye to the warning signs of a narcissistic and power-hungry leader. Hitler gained power because people underestimated him and allowed him to climb the ranks. The world has made a promise to “Never Again” allow someone like Hitler to rise to power, but the key to keeping this promise is recognizing the characteristics of a leader like him.
These characteristics include a tendency to shift blame, a belief in themselves as the only solution, and a talent for creating angry polarization in order to gain importance. If we only remember to pay attention to these red flags once a group has garnered a large following, we would be acting too late. Then we’ve learned nothing from the past.

These traits and events are not relegated to ancient history. They persist in our present-day world, as evidenced by riots at capitol buildings, unchecked leaders who have become drunk on power, and the rise of self-anointed saviors who claim to have all the answers. They thrive in a new age of sycophants and fear-mongering, taking root and festering like a cancer. But the solution is not to look for a false idol to save us from a world that is only viewed through the lens of those that are the “problem”. The power must flow to the people, by the people, and for the people.

We must recognize the warning signs and stand up to those who seek to manipulate and control us. We must be vigilant. By doing so, we can stop the cycle of history repeating itself and move towards a brighter, more equitable future. We must remember that the strength lies in our unity and our unwavering dedication to the principles of freedom, justice, and equality.

Yes, the comparison of a person to Hitler is a serious one and should not be taken lightly. That being said, it is important to remember that Hitler was not always the monster he is remembered as, but rather a product of a series of oversights and underestimated dangers. Humanity created a path for a person to spread a heavily distorted philosophy, hatred of one’s neighbor masked as “strength”. We must remember the lessons of the past and remain vigilant against the rise of any individual with similar tendencies and beliefs. Only then can we truly honor the commitment to “Never Again” and “Never Forget.”

So, the next time you hear someone compared to Hitler, don’t just brush it off as hyperbole. It’s important to weigh these comparisons against a 34-year-old Hitler, still acting well below his potential for evil. Keep an eye out for the warning signs of a leader who could lead us down a dangerous path. By recognizing these traits, we can honor our commitment to ensure a better future for all.

Finding hidden talents lost to your childhood

The reason why I suck at writing, hate reading and have never been able to pick up languages – and how I proved myself wrong.

I’m sitting in my dimly-lit, third-grade classroom. My mom and I are sitting in hard, vibrantly-colored, plastic chairs. My English teacher, who is sitting across from us, is your typical sweet southern grandma, until she opens her mouth.

“I know I’m not supposed to say it,” she says anyway, from her wrinkled, fuzz-covered lips, “but, if I were you, I’d go home and give him a good spanking.”

I don’t know if my mom nodded, ignored it, or what came after, other than the feeling of betrayal from my teacher, mom, and the educational system. I wasn’t a bad kid or bully. This discussion wasn’t the result of my lobbing spitballs. This was Mrs. Manard’s solution to my “C” level performance.

Looking back, I tried to do the things required to excel in school but, try as I might, I couldn’t do them the way I was supposed to. I disliked the slow pace of English class and reading large books that seemed irrelevant to my life. I already knew how to read, write, and speak. Knowing the rules as to why one should never end a sentence with a preposition felt unimportant.

Without knowing it, Mrs. Manard redirected my educational trajectory, and, by 10 years old, I decided, “I suck at English.”

My foreign language class wasn’t much different. Aside from having a much nicer teacher, I didn’t do well memorizing all the rules. There was no satisfaction in the months of repetition required to eventually say, “Your cow is fat.”

I had another epiphany at 12 years old. I was never going to be able to pick up new languages.


Some stories, like mine above, become obsolete in adulthood, but never get a makeover.

Being a “bad speller,” “bad at math,” or “not being witty” are a few examples of stories you may have calcified during childhood. They are either told to us, beat into us, or remnants of unwanted consequences we had to endure.

These stories are as relevant to us now as a favorite toy or blanky. They are anchors that swaddle us in chains, leaving us comfortably limited. We see these features as foregone conclusions, but, somehow, we are unable to remember when these features formed or when we last questioned them.

Maybe it’s time to update our stories.

The story above is part of my history. It made me who I am today. But, it’s based on old experiences and, therefore, outdated. 

If a 10-year-old kid walked up to me now and told me how to live my life, I would think it was a joke. Yet, somehow, my 10-year-old self is still telling me how to respond to my environment. 

I can’t continue to rationalize this logic. It is time to update my stories and make them more relevant to my current environment, social circles, and interests. It’s not about changing who I am, but ensuring I am not limited to who I thought I once was.

Like the rest of the world, the isolation of COVID provided me with an opportunity to pause, reflect, and assess. An opportunity to dissolve the negative assessments of my capabilities. This simple reframing immediately altered my perspective. I went from reasserting my shortcomings out of habit to searching for ways to reexamine them. 

Take languages, for example. Soon after I took this new approach, I caught myself responding to the question,  “Do you speak any other languages?” with a canned,  “I am good with learning software languages, but have never been able to grasp foreign ones.”

The first time I used that response was in high school. High school?! It has been a reflex, hidden, very literally, under my nose for decades. 

I decided to test the theory. I began looking for language apps. If one didn’t suit me, I tried another. I found groups at work that were studying languages (turns out a lot). I Googled hacks to learn languages quickly. I found platforms that connect users to native speakers around the world, so they could learn for pennies on the dollar. I kept what I liked and threw out what didn’t work for me.

A year later, I’m speaking French and Spanish at an Intermediate level. I now see the world in a new light. Like a veil being lifted around me, I now recognize the lyrics of foreign songs, follow dialogue in foreign flicks, and eavesdrop on tourists at my local coffee shop. I didn’t just learn languages. By challenging my old thinking, and with little effort, I illuminated a new world.

Enthused by the results of this formula, I applied it to my “sucking at English” and so many other false truths weighing me down over the years. 

Through this experience, I had an epiphany: Maybe I have always loved English and languages. Maybe I just hated a few childhood classes that unfortunately bound me to a false narrative.

Let’s close the book on these old narratives and make room for a new, liberating reality.

Redefining your reality 

Introspection is paramount in discovering and redefining outdated stories. I had to catch myself repeating old facts to others, and then determine whether they were outdated.

That’s your clue. 

Then, reset and re-create that truth from scratch.

  1. Catch yourself. When you hear yourself assertively self-deprecating what you’re capable of, replace it with, “I haven’t taken time to be good at it.”
  2. Step back and see if you can pinpoint when you formed that opinion. Who were you then? Is it possible you’ve evolved in other ways since then? Are the issues that blocked you still present now?
  3. Cut out what’s no longer serving you. Do you spend hours on the phone or TV? Maybe cooking every day is a burden and ordering out once a week removes it. Can you trade a day, hour, or activity to investigate this question? Maybe block your work calendar for 30 mins, one day a week, or add an activity to your wake-up or sleep ritual. Maybe you mow the lawn one less time a month, and it grows just a bit longer. These are a few trade-offs you can make to open yourself up to new possibilities. Personally, I deleted all my social apps and replaced them with Duolingo.
  4. Choose one thing from #1 and start researching ways to engage it with the few minutes gained in #3.
  5. Give it 6 months and see if your story changes.

From a writing hater, to a writing lover

Where the hatred started

Writing has never been easy for me. It isn’t for a lack of wanting. My experience in school wasn’t helpful.

Since I can remember, I yearned for the ability to get all my thoughts, observations and theories onto paper. My hands just couldn’t keep up. When I took a shot at writing quickly, the results were illegible. When I took the time to write cleanly, the thoughts would slip through my fingers.

I couldn’t strike a balance and wasn’t willing to push through the torture of building skill through the slow, methodical, practice of writing and rewriting my ABCs. I neither had the penmanship nor the patience. And, with that, I could only assume writing wasn’t my thing.

This frustration as a child turned into a hatred toward writing, and that hatred turned into avoidance.

I slogged through school and found creative ways around my poor penmanship. It’s not like I didn’t love other art forms, but putting pen to paper felt dull, overly academic, and unimportant. I didn’t see how writing could have the same beauty and value as a Picasso, or express the emotions of a Rachmaninov.

In an adolescent, cool-guy way, I would take a sort of pride in “not being a writer.” Or, I’d say, “I’m good at other things — how about you write it up?” It was easier to do than admit I was bad at it. The way most children respond when they try to justify a lack of skill in some area.

That became my story. And it was left unedited for decades.

Along the way, with the advent of the computer, I thought I was saved. I was one of the lucky ones where writing by hand became obsolete in my lifetime. Good riddance. I could finally leave handwriting in the rearview.

Once I was out of school and gaining balance in the real world, I took another crack at writing. Now that writing by hand was no longer a blocker, and spelling and grammar was managed by machines, maybe I could become a writer after all.

Confronting what I now realize were years of excuses, I decided writing would no longer be a weakness in my arsenal of tools. It was time to revise my story. Since then, I’ve had a lot of catching up to do.

Okay, let’s try that again

In my 20s, when I started my first company, I realized the power of the written word. In order to communicate a vision at scale, one must codify their thoughts so others may follow. In order to improve, I started a blog and set out to post daily for a year. While I evolved considerably from my first post to my last, I still had a long way to go.

Years later, after hitting a plateau and going on hiatus, I decided to hire a writing coach. She swore by the power of “morning pages” laid out in the book “The Artist’s Way” by Julia Cameron. In it, the author believes one must return to the written form to connect with one’s inner artist. My new teacher passed on that requirement to me, and with it, I had come full circle. In order to learn to be a writer, I had to once again slog through my pitiful excuse of penmanship.

What surprised me about this go around was, for the first time, this teacher told me she didn’t care about how my writing looked or what it said. To her, none of that was important. She just wanted me to use my hand to write — anything. As long as paper and pencil were involved, she’d be happy.

It was — freeing.

It shut down the overly critical side of my brain, further imprisoned by early schooling.

I had a second wind.

I began to write in my notepad, about nothing, for five minutes a day. Through aches in my fingers, and in spite of all my ideas vanishing right as I picked up my pencil, I followed the prescription. I planted notebooks, pencils and sharpeners around the house, so nothing could get in the way when the compulsion to write struck. At times, when I had nothing to say, I would scribble some variation of, “I am writing this even though I can’t think of anything so that I don’t stop writing until my time is up.”

After a few weeks, I could see a connection forming between my hand and my mind. Where thoughts used to swirl around in my head and go nowhere, now they had an exit route. I developed a Pavlovian reaction to search for paper when the marble in my head began to rattle. And, unlike the brevity of notes I took on my phone or computer, I found my handwritten entries getting longer at each session.

The potential was certainly there, but I still had one issue to overcome: I couldn’t read any of it.

Tech to the rescue

I’m a gadget guy. And, I’ve used my affinity toward doodads as a mental hack, tricking my mind to focus on important things I need to do that I have no interest in doing otherwise. Sure, I could vacuum and begrudgingly roll over the carpets while wishing I was doing anything else, but I prefer to get a Roomba, configure it, and whistle while it works.

“Hold on!” I thought one night, staring at my pad and pencil, mustering the strength to start yet another writing session. “Can this trick help solve my aversion to writing? If pencil and paper are a painful reprise of teenage angst, modernizing my workbench with A.I. apps that digitize handwritten text via an iPad and Apple Pencil is a different beast entirely.”

I can get behind this.

I scoured the app store for apps that could recognize my chicken scratch, while providing the right amount of tech-nerdiness to put a spoonful of sugar into my writing regimen.

I knew I found “the one” with Nebo.

The app perfectly merged modern digital tech with old-school writing, and I found myself looking forward to engaging with the experience. I went from being forced to do “morning pages” each day, to feeling like I couldn’t stop journaling, writing, or editing my work. What started as a few sentences a day has now blossomed into pages. In fact, this very text is being tapped out on my iPad using my Apple Pencil while lying in bed at 11:14PM with my wife asleep beside me, and I am having trouble stopping.

Whether one considers me a writer or not is unimportant, for I have fallen in love with writing, and with it my story has finally been rewritten.

Attention Deficit or Boredom Averse

“Back when I was a kid we had far greater attention spans!” Well, whoopy-doo for you.

YouTube, TV, commercials, audiobooks, and Facebook. The list of products that drive us into a pattern of ingesting only short blips of information goes on and on. Some believe the consequence is a loss in our ability to pay attention to any lengthy (more traditional) format, and in their mind, anything of real value.

I take issue with that belief entirely. I’d rather ask this: Why is there a requirement to be able to pay attention to information in long-duration formats in the first place, and what makes that info so much more valuable? Isn’t the goal of listening, reading, or watching information to comprehend it? Where do “length” and “staying still” play into that requirement?

As a kid, people thought I had attention problems. I had tons of energy and not enough places to put it all, especially during school hours. Reading one long-ass book (that I had no interest in) for a class (I didn’t care much about) was not very motivating; I perceived writing in much the same way. Needless to say, I didn’t excel in those areas much.

For me, learning was just that — learning. It wasn’t a proof of my ability to sit still and do nothing for a long period of time, or to impress a teacher. Learning was all about answering questions, digging into things that interested me, and unraveling things that confused me. When the internet became “a thing”, I found myself ingesting tons of information daily, and it allowed me to pursue those questions with ease.

Fast forward a couple decades and I’m sitting here auditing an edX class at 2x speed. I’ve skipped over a few sections that do nothing more than set the audience up for what’s coming (Boring. I get it. Let’s move on). And you know what? I love it. I love taking classes! As for reading, in this new environment of self-paced, Kindle-based materials, I’ve found myself reading more books than I ever did in high school. Even writing has become interesting to me. I started a blog seven years ago to become a better writer, and, over 200 posts later, I’d like to think I have improved quite a bit. With all this interest in taking classes, reading, and writing, I have to ask myself: do I have an attention problem, or am I just terribly averse to boredom (and the old, slow-moving teaching styles)? Which led me to ask, why the hell would anyone want to be great at being bored in the first place?!

As our technology pushes us into a new format of learning, maybe it is less about “shut up and sit still”, and more about, “here is the world — have at it!”

It’s easy to think the world is getting dumber. We see “views” on YouTube of someone getting hit in the nuts soar into the millions, and people with obnoxious (or useless) things to say use social platforms to say them at scale. It is important to remember that, with or without these new mediums, people have been dumb for a long time. It is also important to keep in mind that the speed of advancements in technology is increasing exponentially. Those advancements push social media, but they also cut the time it takes to roast a turkey, pop popcorn, and provide classes to people like me who can now learn more efficiently than they ever have before.

I think learning was built on an extremely inefficient foundation because we didn’t have any other way to do it. Now, we are finally trimming the fat. The problem is, our kids are now able to eat lean beef, but we are insisting that they still must chew the lard first. Why?

I say, take those little bits of data, re-arrange them, pause them, and fast-forward them as you wish. Let your curiosity for answers be the guide, not a demonstration in formalities.

We aren’t losing the art of education, we are deconstructing it and reassembling it through the gift of technology. The world has started to suit everyone’s individual pace, interest, and schedule. All the lost hours of dramatic pauses, introductions, segues or fluff are gained in the hours we can instead paint, exercise — or better yet — learn something entirely different.

Sure, it may mean that listening to a 1-hour speech at work will be more difficult for a person who is used to this newer, more efficient medium — but whose fault is that? Why the hell are people talking for an hour anyway?! Is it necessary to achieve their objective? Are we simply committed to a style of interaction in the real world for no other reason than our attachment to tradition? Are we simply not yet ready to embrace a more efficient style of information-sharing that the digital world has built from the ground up? If you are looking for art and style, maybe you should go see a play.

As we move away from requiring our audience to sit down and shut up for extended periods of time, let’s keep our goals in mind. We are not here to prove to others that we can sit through something that does not excite us, but to find out what does. It is not to prove we can endure boredom, but to break the shackles that required us to be bored in order to learn. It is time we agreed to fight boredom, and recognize it as an old, outdated emotion.