Theory



Computer programming theory

Todo

Only random thoughts so far; the goal is codification.

Unicon and Computer Science

Unicon is well placed to advance computational theory, computer science, software engineering, and other critical areas of digital construction.

It applies to everything from simple Hello, world examples to complex graphing, execution monitoring and visualization, and digital computing topics as yet uncodified.

And below software development lie the theories surrounding result-oriented computer programming, illustrated in the sketch below.
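
For a taste of what “result oriented” means in practice, here is a small example of my own showing Unicon’s goal-directed evaluation: an expression can yield a sequence of results, and the surrounding context decides how many are demanded.

    procedure main()
        every write(find("n", "unicon"))   # writes 2, then 6: every result
        write(4 < find("n", "unicon"))     # goal-directed: writes 6
    end

The second line succeeds by backtracking: find first produces 2, the comparison fails, find is resumed, and the 6 satisfies the goal.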

Proving Unicon

Proving things about Unicon is not easy in the large, as there are building blocks and notations still to create. Within those smaller pieces, though, there may be facts that can be proven.

seq()\1

How hard will it be to prove that this code will generate a one, and then be forced to fail by the limit expression?
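
As a concrete harness, here is a minimal runnable wrapping of the expression (my own framing, not from the original):

    procedure main()
        # seq() generates 1, 2, 3, ...; the limitation \1 allows at
        # most one result, so the loop writes 1 and then the limit
        # forces failure, ending the every loop
        every write(seq() \ 1)
    end

Running it writes a single 1; the every loop terminates because the limited expression has no further results.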

Part of the “how hard will it be” is deciding on the level of detail involved in that expression. There are many layers: the C source code that builds the compiler, the compiler that compiles the Unicon, the virtual machine that evaluates the expressions, and underneath it all a machine with effectively infinite combinations of electronic circuit state. It won’t be easy.

Perhaps it is simply the lack of a viable notation. The Romans, having conquered much of the known world, could not easily tally the spoils of war, despite having a great interest in doing so and putting some of the best and brightest of the time on the task. The Roman numeral system simply did not lend itself to reckoning the sums. That type of arithmetic is easily handled by schoolchildren now that the notation has changed.

Maybe a Unicon programmer will see the light that brings a quantum leap in computer science notation and in the level of mass understanding. Of all the programming languages and design influences, maybe Ralph Griswold was on to something, and the situation just needs a little extra insight to open up a new level of possibilities.

At the moment, correct software development is hard. Maybe, when it comes to source code and the human-machine interface, we are still in a Roman numeral phase of notation.

Or maybe we should step back to P′′ by Corrado Böhm and start smaller: https://en.wikipedia.org/wiki/Corrado_B%C3%B6hm. Böhm describes a Turing-complete theoretical system with 4 symbols, 4 rules of syntax, and 5 semantic clauses: https://en.wikipedia.org/wiki/P%E2%80%B2%E2%80%B2. Maybe more people should study the simple things, in hopes of making the step that replaces current models with a more capable system.
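
To make that concrete, here is a rough sketch of a P′′-style interpreter in Unicon. It is my own construction, meant as a study aid rather than a definitive implementation: L stands in for Böhm’s λ, and a cyclic alphabet of nsyms symbols with 0 as blank is assumed.

    procedure main(args)
        # "LR" sets the first cell to 1 and returns to it (sample input)
        write(pii(args[1] | "LR", 2))
    end

    # Interpret P'' program prog over a cyclic alphabet of nsyms
    # symbols: R moves the head right; L replaces the scanned symbol
    # with its cyclic successor and moves left; ( ... ) repeats while
    # the scanned symbol is nonzero
    procedure pii(prog, nsyms)
        local tape, head, pc, depth
        tape := table(0)                 # unbounded tape, cells default to 0
        head := 0
        pc := 1
        while pc <= *prog do {
            case prog[pc] of {
                "R": head +:= 1
                "L": {
                    tape[head] := (tape[head] + 1) % nsyms
                    head -:= 1
                }
                "(": if tape[head] = 0 then {    # skip past matching )
                    depth := 1
                    while depth > 0 do {
                        pc +:= 1
                        if prog[pc] == "(" then depth +:= 1
                        else if prog[pc] == ")" then depth -:= 1
                    }
                }
                ")": if tape[head] ~= 0 then {   # loop back to matching (
                    depth := 1
                    while depth > 0 do {
                        pc -:= 1
                        if prog[pc] == ")" then depth +:= 1
                        else if prog[pc] == "(" then depth -:= 1
                    }
                }
            }
            pc +:= 1
        }
        return tape[head]                # scanned symbol at halt
    end

Under these assumptions, the sample program LR leaves a 1 under the head, and the interpreter writes 1.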

Either way, forwards or backwards, I’ll opine that Unicon can play a role in the evolution of provable digital construction and of the as yet undiscovered notations that may get us there.


f = ma

Force equals mass times acceleration: f = ma. Kinetic energy equals half the mass times the velocity squared: e = (1/2)mv^2. And energy equals mass times the fastest thing squared: e = mc^2.

Where is the corresponding theory or law for computing? Result equals mass (lines of code, concrete instructions) times electrical impulse (squared?). Result = instructions times pulses squared: r = ip^2?? I’d like to believe that Unicon can help narrow that down (if such a principle or relation actually exists).

Physics comes with massive parallelism. In the n-body equations, every point of gravity is self-powered, and each exerts its influence on the system independently. Unicon’s concurrent co-expressions kind of fit that bill: discrete “points of gravity” able to influence the state of the system independently. Maybe.
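
As a toy illustration of that idea (my own sketch, not from the original; it assumes a thread-enabled Unicon build, and the body count, iteration count, and delays are arbitrary), each “body” below runs as a concurrent co-expression and independently nudges a shared system state:

    global state, lock

    # each body independently influences the shared state, at its own pace
    procedure body(i)
        every 1 to 5 do {
            critical lock: state +:= i   # serialized update of shared state
            delay(?100)                  # pause 1..100 milliseconds
        }
    end

    procedure main()
        local threads, i
        state := 0
        lock := mutex()
        threads := []
        every i := 1 to 3 do
            put(threads, spawn(create body(i)))
        every wait(!threads)             # join all bodies
        write("final state: ", state)    # 5*(1+2+3) = 30
    end

The final value is deterministic under these assumptions even though the interleaving is not, which is the n-body flavour: independent actors, one shared system.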

