Use AI to Accelerate Software Development

People think that AI will accelerate software development by generating code, but coding is only a tiny part of the story. So much of software development precedes code, and these pre-coding activities are usually the blockers that impede acceleration.

Before any code can be written, there is the question of what needs to be built. No work can begin until there is a purpose. Work requires time and resources, which necessitates investment. An investment needs a business case. This means getting to know what customers (users) want, what they are willing to pay for, and whether it would be worth building. You are not an AI-accelerated software developer if you have no purpose, if you don’t know what your customers want, or if you can’t justify the investment.

Traditionally, coders working in a commercial setting have relied on product managers, business analysts, and executives to acquire such knowledge and make such decisions. Usually the coders are not subject matter experts. Coders are always asking “what is the requirement?” and are paralyzed waiting for answers.

They don’t know the problem domain (the customer’s business) and the terms of art. They don’t have a mental model of the problem space. They don’t know what the actors do (journeys, use cases) and how they collaborate to accomplish their business objectives. They don’t know how these actors do their work, how they need to see their information, and what tasks they perform in certain contexts. They don’t know scale (how many users, how many transactions per hour, how much information of each type). They don’t know cost. They don’t know availability. They don’t know regulatory compliance. They don’t know their customer’s products, pricing, and business policies. There is so much they don’t know about the world, because all they know is how to code.

Coding is always blocked waiting for these responsibilities to be fulfilled by roles who don’t code. The decision makers don’t know the technical side of how to code. The coders don’t know the business side of why or what to code. The collaboration is always impaired by this mismatch in skills and knowledge.

For AI to accelerate software development, the business-focused roles must be merged with the technically-focused role. AI tools must be built to fill the skills and knowledge gap for whoever is in the driver’s seat. Human jobs will be lost and replaced by AI. What is not clear is whether coders will learn to use AI to drive the business, or whether product managers will learn to use AI to code. My bet is on the latter, even though my sympathies and bias are with the former. I don’t see AI gaining the competencies needed to become entrepreneurial, visionary, and business savvy.

On the other hand, very few people in the business-focused roles are any good at it. We need only look at the lack of success that Tim Cook, CEO of Apple, has had in introducing innovative new products, relative to what Apple did in the past under the leadership of Steve Jobs, a true visionary and a genius at designing desirable products. All lower-ranked roles are responsible for a lesser subset of Tim Cook’s decision-making. It is a rare gem who can achieve even 1% of the success of a Steve Jobs when it comes to product vision and design. Coders who have this kind of talent would have become founders. Perhaps AI can help accelerate this for coders who have such an inclination. From how Tom Bilyeu describes applying AI to explore business ideas, this is not as far-fetched as I had imagined (a measure of my own ignorance in applying AI this way).

In the near term, the quality of vibe code generated by AI is hit-or-miss. A great deal of human supervision is needed to make it work. But with the progress we are seeing in AI and the accelerating cadence of advances, we should not dismiss the possibility that vibe coding without a heavy human supervision burden will become routine. Since this is a goal that many people are pursuing, the outcome is pretty much guaranteed. There is virtually no chance it will NOT happen.

Until recently, I did not think that AI would have the taste and judgement to produce good designs. Then I was introduced to Cline, a coding assistant. You guide it by writing a project brief, which documents your project’s purpose, structure, standards, guidelines, constraints, technology selections, style preferences, and rules. You write these exactly as you normally would for human teammates, in concise English. Amazingly, Cline understands and complies perfectly. Through this experience, I am now confident that an AI can be guided to do good design by documenting design principles, archetypes, patterns, and trade-offs. AI may not be there yet today, but there is virtually no chance it will NOT happen.

Applied Cosmology: Dimensions and Degrees of Freedom

Today’s lesson in Applied Cosmology: dimensions and degrees of freedom. In physics, Minkowski spacetime has 4 dimensions (3 spatial dimensions and 1 time dimension), expressed as X⁴. In curved spacetime, the number of degrees of freedom for X⁴ is the number of parameters needed to specify this model: fourteen (14).

  • 4 coordinates for identifying a point in spacetime (x, y, z, t)
  • 3 rulers in the space dimensions to measure distance
  • 1 clock in the time dimension to measure duration
  • 3 protractors (x-y, y-z, z-x) to measure the angle of each space dimension with respect to the other space dimensions
  • 3 protractors (x-t, y-t, z-t) to measure the angle of each space dimension with respect to the time dimension
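The count above generalizes: n coordinates to locate a point, plus the n(n+1)/2 independent components of a symmetric metric (rulers and clocks on the diagonal, protractors off the diagonal). A minimal sketch:

```python
# Degrees of freedom for an n-dimensional spacetime model:
# n coordinates to locate a point, plus n*(n+1)//2 independent components
# of the symmetric metric (rulers and clocks on the diagonal,
# protractors off the diagonal).
def degrees_of_freedom(n_space: int, n_time: int) -> int:
    n = n_space + n_time
    return n + n * (n + 1) // 2
```

For 3 space and 1 time dimension this gives 4 + 10 = 14, matching the enumeration above: 4 coordinates, 3 rulers plus 1 clock on the diagonal, and 6 protractors off the diagonal.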

In the same way, we wish to identify configuration dimensions with respect to separation of concerns. The number of separate concerns is the number of dimensions.

Within each dimension, each concern has many parameters. The number of dimensions is modest (fewer than a dozen?), while the total number of degrees of freedom (parameters) is large (hundreds?).

For example, the horizontal scaling dimension is parameterized for the platform and infrastructure by the number of worker nodes in a cluster. Within an application component (e.g., a Deployment or StatefulSet), horizontal scaling is parameterized by the replica count.

The vertical scaling dimension is parameterized for the platform and infrastructure by the compute shape (CPU architecture, CPU count, memory, boot volume) of each worker node. Within an application component, vertical scaling is parameterized by the CPU, memory, and storage requests and limits of each container within the pod template. Other dimensions of interest are:

  • high availability
  • disaster recovery
  • workload complexity
  • workload scale
  • workload isolation
  • security isolation (and many more)
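The horizontal and vertical scaling dimensions described above can be sketched as nested configuration parameters. This is a hypothetical illustration (the field names and values are made up, loosely modeled on Kubernetes concepts), not a real manifest:

```python
# Hypothetical sketch: each dimension contributes parameters (degrees of
# freedom) at both the platform layer and the application layer.
scaling_config = {
    "horizontal": {
        "platform": {"worker_nodes": 5},      # cluster size
        "application": {"replicas": 3},       # replica count
    },
    "vertical": {
        "platform": {"shape": {"cpu_arch": "arm64", "cpus": 8,
                               "memory_gb": 64, "boot_volume_gb": 200}},
        "application": {"containers": [
            {"name": "app",
             "requests": {"cpu": "500m", "memory": "1Gi"},
             "limits":   {"cpu": "2",    "memory": "4Gi"}},
        ]},
    },
}

# The total degrees of freedom is the count of leaf parameters.
def count_params(node):
    if isinstance(node, dict):
        return sum(count_params(v) for v in node.values())
    if isinstance(node, list):
        return sum(count_params(v) for v in node)
    return 1
```

Counting the leaf values gives the degrees of freedom contributed by just these two dimensions; each of the other dimensions listed above would add its own parameters.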

Renewables to Gas Generation

We should look into diverting all solar and wind generation to storing energy in covalent bonds between hydrogen and carbon or nitrogen, producing non-fossil fuels that can then be applied toward reliable power generation later.

We could integrate a micro-reactor implementing the Sabatier reaction (CO₂ + 4H₂ → CH₄ + 2H₂O) into every solar and wind farm, disconnecting its output from the grid. Pair it with a miniaturized combined-cycle natural gas peaking plant (connected to the grid) to consume that methane as needed. This becomes a closed system in which the CO₂ is recaptured and recycled into fuel.
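As a sanity check on the closed cycle, the Sabatier reaction conserves mass, and each kilogram of hydrogen yields roughly two kilograms of methane. A quick sketch with approximate molar masses:

```python
# Mass-balance sketch for the Sabatier reaction: CO2 + 4 H2 -> CH4 + 2 H2O.
# Molar masses in g/mol (approximate standard values).
M = {"CO2": 44.01, "H2": 2.016, "CH4": 16.043, "H2O": 18.015}

reactants = M["CO2"] + 4 * M["H2"]   # mass in, per mole of CO2 converted
products = M["CH4"] + 2 * M["H2O"]   # mass out

# The closed cycle conserves mass (within rounding of the molar masses).
assert abs(reactants - products) < 0.1

# Methane yield per kilogram of hydrogen fed to the reactor.
ch4_per_kg_h2 = M["CH4"] / (4 * M["H2"])
```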

Renewables already need natural gas generation as backup. If we want to shift more generation to unreliable, intermittent solar and wind, we can only achieve reliability through energy storage. Battery technology is too expensive and cannot scale to grid-scale overnight power delivery. This concept reshapes how renewables work in tandem with gas generation, adding a reactor for methanation as well as emissions capture to provide a closed fuel cycle.

AI will not eliminate coding

People who don’t code are bound to think that AI will eliminate coding jobs. Coding in a programming language is expressing a machine-precise specification for how to do something. Asking an AI to output that expression is an act of programming: the prompt(s) must capture a near machine-precise specification of what needs to be done. That is coding, except in natural language.

A great deal of software engineering is requirements analysis and design: understanding the problem space and the solution approach so that they can be captured in natural language and diagrams. These are necessary today for many people (management, document writers, support staff, marketing staff, sales staff, customers, end users, and fellow coders) to gain a common understanding and execute a plan as a cohesive team. Capturing a vision, goals, a plan, a design, and all of the nitty-gritty details to fully specify the user experience is just coding, even if expressed for a human audience. Indeed, even when AI takes over the full responsibility of generating the machine code, we will still need coders to do all of that. Perhaps the burden can be eased to a degree, as toil and drudgery are automated further by AI (e.g., generating designs based on archetypes and patterns when prompted with compact terms of art known to coding experts). This is still coding.

Ask a non-coder to write out the precise steps in English for how to tie one’s shoes. How many people can do so correctly enough to teach a naive robot to accomplish that task? Surprisingly few. That’s coding. That skill does not become obsolete because of AI.

Road to Decentralization

The road to decentralization is long and hard. Let’s map out the way to get there. The journey will necessitate some key innovations.

Today’s Big Tech platforms exert centralized control over the service, the protocol, and the software mechanisms for our applications. These are vulnerable to government coercion and force that deprive us of rights, such as through censorship, denial of service, or embargoes. A truly decentralized architecture could not be coerced or forced to obey any law or authority. Code is law, and code can take on a life of its own once unleashed: even if you imprison or kill some coders, you could not control what people choose to run.

Similarly, markets are free. No amount of government regulations and laws can deny the truth that the more tyrannical control is exerted over legal markets (such as with restrictions or bans), the greater such markets fall outside of the government’s control, as demand will motivate supply to shift to black markets.

Cryptography

The fight for freedom requires many technologies. Chief among them is cryptography, from which the “code is free speech” mantra was born to protect cryptographic algorithms from being controlled by government as munitions. Internet commerce is built on this foundation.

Decentralization

However, decentralization is still in its infancy. The Internet has evolved in the direction of centralization (Big Tech service providers), which positions platforms to become tools of tyranny. We have not yet built a sufficient system of technologies to decentralize the applications we are accustomed to.

Money

We are beginning to see Bitcoin as sound censorship-resistant money evolve toward being usable. It has a ways to go. The hurdles will be immense, as the goal is necessarily to burn the current system to the ground. We know it. They know it. Resistance is futile, as the system is burning itself without regard for BTC. But the incumbents will hang on in desperation until the bitter end.

Identity

We are confused by digital identity. Having a technology for identity is essential, but we know a government-imposed technology would be dangerous. Bitcoin has one of its own. We need a generalized mechanism so you can own your data and your access to your application services. This depends on having some mechanism to identify you. However, we must not adopt any tech that entrusts the government. Central control (by government or a corporation) over identity would make you vulnerable to being unpersoned by that authority. We need Self-Sovereign Identity (SSI), where the person owns their own credentials (you create and hold your own private key). Mature, widely adopted SSI technology does not exist yet. Some point with hope to a Nostr nsec/npub key pair.

Social Network

Having identity, we can now form relationships. Connections between people enable social interactions. You need to be able to take your connections with you, so your social network is not held hostage by any platform.

Self-Sovereign Data

Your identity enables ownership over your data. This includes access control, privacy, and integrity. You need to be able to store your data securely (encrypted with your key, replicated), so that it is broadly accessible by all your applications (in the cloud). Everything associated with you on every application should be considered your own, including your profile, settings, preferences, social network, authorizations, and your application data (e.g., content, documents). No one should be able to rug-pull you or hold your data hostage.

Web of Trust

With a social network that we can take with us to any application without worrying about rug-pulls, we can rely more heavily on it. If everyone carefully curates their connections for credibility and reputation, we now have a Web of Trust. This is useful for calculating how trustworthy another person is based on intermediate relationships. This also gives us a good mechanism for distinguishing legitimate content from spam.
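The trust calculation described here could be sketched as follows. This is a hypothetical illustration (the graph shape, the multiplicative weighting, and the chain-length limit are my assumptions, not any particular Web of Trust protocol):

```python
# Hypothetical sketch: estimate trust in a stranger by propagating trust
# along chains of curated connections. Edge weights are direct trust
# ratings in [0, 1]; a chain's trust is the product of its edges, and the
# overall estimate is the best (maximum) over bounded-length simple paths.
def trust(graph, src, dst, limit=3):
    """graph: {person: {neighbor: direct_trust}}; limit caps chain length."""
    best = 0.0
    stack = [(src, 1.0, {src})]
    while stack:
        node, score, seen = stack.pop()
        if node == dst:
            best = max(best, score)
            continue
        if len(seen) > limit:
            continue
        for nxt, weight in graph.get(node, {}).items():
            if nxt not in seen:
                stack.append((nxt, score * weight, seen | {nxt}))
    return best
```

For example, if Alice rates Bob 0.9 and Bob rates Carol 0.8, Alice's estimate of Carol is 0.72; a person with no chain to the target scores 0.0, which is one way to distinguish legitimate content from spam.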

Unstoppable Services

There will be many more protocols (especially to enable peer-to-peer), platform capabilities (e.g., higher-level virtual machines that span infrastructure providers), and architectural patterns that need to be invented to enable application components to become unstoppable by the state and other malicious actors. Everything needs to be built to be resistant to censorship, denial of service, and deprivation of rights. Users must have an exit to take their business elsewhere.

Alternatives should be ubiquitous. Protocols and standard interfaces should enable uniformity so that platforms can be commoditized. That way if one provider does not live up to their promises, users can take their identity and data to a competitor or to a platform for self-hosting. Open source applications are preferred for self-hosting, but it’s foreseeable that as decentralization becomes the norm, commercial services will enable self-hosting as well. This reverses the shift to SaaS, but increases demand for IaaS and PaaS for self-hosting.

We have much work to do to manifest this vision. We are years away. Perhaps the timeline we should expect is somewhat aligned to the burning down of the fiat world. The rise of decentralized money should be accompanied by the decentralization of everything money can buy.

Consumer Regulation is Most Powerful

I believe very much in the most powerful and incorruptible regulatory regime possible for all business affairs, including AI. That regime, called consumer regulation, puts the power to enrich or destroy any business directly in the hands of the customers.

Individuals free to think and act according to their own reason and assessments will course-correct quickly and effectively. The freedom to buy and sell voluntarily is the most powerful and ultimately incorruptible regulatory force possible.

Customers are decentralized sovereign individuals, who can evaluate evidence independently. Each directly benefits from their own good judgement, or is harmed by poor decisions. This motivates quick error correction and optimization based on evidence. Knowledge spreads socially.

Customer assessments of benefit are immediately signalled in their buying decisions, the prices they are willing to bear, and their recommendations. Likewise, assessments of harm impair demand and reputation, which deny revenue to the bad supplier.

Contrast government regulation: it does not benefit from being right, incurs no cost for being wrong, and has no liability for harm done to others by its poor decisions. Centralized control is corruptible, politicized, and often captured. It is slow to react to new evidence or to admit error.

Government power is limited to coercion, force, and penalties. It is powerless against those who do not recognize its authority, such as criminals and black markets. Contrast the free market, where customers are equally powerful in all situations to buy and sell at will.

Conclusion: we must not usurp the most powerful regulatory power (customers) by delegating regulatory authority to government, which is weak, feckless, corruptible, dishonest, unaccountable, and a vulnerable regime that can be captured.

Thanks, @Mark_J_Perry, for this article which coins the term consumer-regulation.

Libertarians: Move to Somalia

The “move to Somalia” trope shows a deep misunderstanding of what is required to bring about a good and orderly society. [See original tweet thread]

https://twitter.com/RkPhxQuatro/status/1714390232735629425?s=20

Whether society is mired in chaos, violence, and crime or thriving in peace and prosperity depends entirely on the morality of its individuals. It is not a function of government or lack thereof. One need only look at urban areas in the US with the most stringent law enforcement to see the worst crime, whereas rural areas with the least enforcement are the most crime-free. The difference is in the principled self-regulation of behavior by the individuals.

Whether society has oppressive government rules and enforcement, or freedom, affects only whether optimal conditions are provided for morally-behaved individuals (already at peace and refraining from violence and crime) to flourish and thrive. Liberty enables human flourishing. However, no degree of heavy-handed government force can subdue a chaotic, violent, and criminal population. Indeed, that is what Somalia demonstrates.

Government Big Tech Censorship Prohibited

Today, on July 4, Federal Judge Terry Doughty finds the government likely violated the First Amendment by conspiring with Big Tech in a “far-reaching and widespread censorship campaign.” The judge grants a preliminary injunction blocking the DOJ, FBI, and DHS from working with technology companies to censor content. The “Ministry of Truth” style censorship is now prohibited.

[Updated July 5, 2023]

[Updated July 6, 2023]

[Updated July 7, 2023]

[Updated July 10, 2023]

[Updated July 14, 2023]

Neutron Stars and Black Holes

I was reading this article on the behavior of neutron stars and black holes being very similar with regard to accretion. This reminded me of a hypothesis I’ve had for a very long time. I will state it below.

Hypothesis

My hypothesis is that the compact object at the interior of the black hole is the same stuff as a high mass neutron star, whether you call that a quark star or whatever. The only difference is that the event horizon grows to engulf the entire object. This hypothesis may also exclude the existence of a singularity, although I’m not confident about that, as I will also explain.

Neutron Stars Grow Into Black Holes

When a neutron star accretes matter gradually, eventually there is enough mass to become a black hole. At that critical moment, is there a violent event? A non-violent event would suggest the interior of the black hole is unchanged from the neutron star that was its progenitor.

If the gradual accretion scenario is non-violent, it suggests there is no state change in the compact object itself. Only the curvature of spacetime changes, as the Schwarzschild radius grows to engulf the entire object. A non-event suggests the nature of the compact object is unchanged: there is no release of energy.

Black Hole Singularity

The Schwarzschild radius is > 0 for a neutron star. Its interior within that radius is already subject to black-hole spacetime curvature without evidence of a singularity. Therefore, I disbelieve a singularity exists at all.
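For scale, the Schwarzschild radius follows directly from r_s = 2GM/c². A minimal sketch, using approximate constants and a hypothetical 2-solar-mass neutron star:

```python
# Schwarzschild radius r_s = 2*G*M / c^2, in meters.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def schwarzschild_radius(mass_kg: float) -> float:
    return 2 * G * mass_kg / C**2

# For a 2-solar-mass neutron star, r_s comes out near 5.9 km, compared
# to a typical observed stellar radius of roughly 11 km.
rs = schwarzschild_radius(2 * M_SUN)
```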

This also supports the idea that black holes and neutron stars are made of the same stuff. The growth of the Schwarzschild radius is gradual with accretion. There isn’t a continuous series of violent events as the neutron star gains mass. Or is there? Maybe that explains the violent outbursts observed from neutron stars, known as starquakes: the neutron star compacting its material into a lower energy state. These are thought to be a surface phenomenon. I’m not so sure.

On the topic of a singularity, if the Schwarzschild radius is already > 0, that means the innermost core of a neutron star is already a black hole. Is there any evidence of a singularity? What would such evidence look like to an outside observer? I read somewhere that the singularity is not necessarily a point in space, but can be thought of as a point in time. My mind struggles to imagine what that would mean. Given that information cannot escape the event horizon except as Hawking radiation, we shouldn’t be able to observe any evidence of what is inside. Therefore, we cannot be certain.

Have I talked myself out of my hypothesis? No. If the final transition from neutron star to black hole is non-violent, I think it supports my claim. My hypothesis is also consistent with the currently accepted explanation of starquakes as a surface phenomenon. However, if a starquake results from gradual growth of the Schwarzschild radius within a neutron star, I think that would falsify my claim. High energy events from the interior would suggest a change in state of the stuff inside the neutron star.

Insights into innovation