So i launch into some weird and unusual code-related nonsense:
this is a tidied copypasta [og is scrnsht below] of a brainspew that became long lost and hence all but forgotten. i wonder if you find it as exciting and inspiring as i did?
```
concept:
coding language
enter as unicode glyphs
uses analogue logic
copes with digi and analogue
works euro volts
some non-lins?
type boolean function as glyph string, enter numbers for initial conditions? plus truth table.
out is approx?
e.g.:
to design serge vcs
trig -> s latch -> slew -> out
^ -> r |> cv
| 0v+ -<<<<<<<<v>>>>>+ -8v
| comp | bipol comp
----------|-----><- mix
whats point.
forget.
```
ignoring the adorable 'what's the point - forget it.' at the end, there's quite clearly an unhewn gem of an idea within that mumble-jumble. or two, or three!
First of all, there's the nice lil concept of replacing math functions such as boolean operations with single keyboard characters:
- saves time/hassle/strain: fewer keystrokes per operation, and way less squinting when you have to read masses of code fast (see the lil sketch right after this).
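to make that concrete, here's a minimal python sketch of the single-glyph idea. the glyph picks, names and left-to-right evaluation are all mine, purely for illustration - none of this is from the actual woflang code:
```
import operator

# glyph -> boolean operation; the glyph choices are arbitrary
GLYPH_OPS = {
    "∧": operator.and_,  # AND
    "∨": operator.or_,   # OR
    "⊕": operator.xor,   # XOR
}

def eval_glyphs(expr: str) -> bool:
    """evaluate a flat glyph string like '1∧0∨1', strictly left to right"""
    tokens = list(expr)
    acc = tokens[0] == "1"
    i = 1
    while i < len(tokens):
        op, bit = GLYPH_OPS[tokens[i]], tokens[i + 1] == "1"
        acc = op(acc, bit)
        i += 2
    return acc

print(eval_glyphs("1∧0∨1"))  # True: (1 AND 0) OR 1
```
no precedence, no parens - the laziest possible evaluator, but it shows how terse glyph entry can get.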
There's also the concept of calculating things with fuzzy logic [analog logic], the way an analog computer would, and further on the notes mention handling both digital/binary logic _and_ analog logic together (sketched just after these bullets).
- analog computers, having not just `0` or `1` as their degrees of freedom but anything from `0` to `5`, `10`, `100` or even further, can perform certain calculations incredibly fast, sometimes even outperforming their digital binary cousins - just look at how long big banks of valves spent calculating ballistics trajectories for the US military!
- analog computing done by an electronic/solid state machine, with those greater degrees of counting freedom deep-coded into its very OS and machine code/instruction set, could offer a great taste of those advantages without requiring unreliable, high-maintenance tubes and immensely long reprogram/reconfig times.
- a computer employing a hybrid of both analog and digital/binary logic could be unique enough to be worth actually developing...
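here's a rough sketch of what that hybrid could look like, using the standard fuzzy-logic operators (min for AND, max for OR, complement for NOT) over a 0.0-1.0 range - my own stand-in for the voltages, not anything prescribed in the notes:
```
# fuzzy logic over a continuous 0.0-1.0 range, plus a threshold to
# collapse back to crisp digital 0/1
def f_and(a: float, b: float) -> float:
    return min(a, b)  # fuzzy AND: the weaker signal wins

def f_or(a: float, b: float) -> float:
    return max(a, b)  # fuzzy OR: the stronger signal wins

def f_not(a: float) -> float:
    return 1.0 - a    # fuzzy NOT: complement

def digitize(a: float, threshold: float = 0.5) -> int:
    return int(a >= threshold)  # analog value -> binary bit

analog = f_or(f_and(0.8, 0.3), f_not(0.9))
print(analog)            # 0.3 - a graded, analog-style answer
print(digitize(analog))  # 0   - same answer, collapsed to digital
```
the digitize() step is the 'copes with digi and analogue' part: stay graded for as long as the calculation benefits, then snap to binary at the boundary.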
Thirdly we have the concept of using unicode glyphs not just as shorthand but as an actual code language: one that takes preprogrammed initial conditions and produces truth tables of the possible outcomes for whatever variables get entered. that opens the door to mass pre-calculation/compression/acceleration, along with enhanced code parsing, assembly, simplification and actual running.
- with careful planning it should be possible to initialize memory super efficiently, so that the most commonly required results are not only precomputed but kept right at the forefront of access (toy version below).
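a toy version of that pre-calculation idea: for an expression over n boolean inputs the full truth table is only 2**n rows, so you can build it once and answer every later query by lookup. i'm cheating with python's own eval purely for brevity here, and the cache size is plucked from thin air:
```
from functools import lru_cache
from itertools import product

# translate glyphs to python keywords and lean on eval() for brevity;
# a real implementation would use its own evaluator
GLYPHS = {"∧": " and ", "∨": " or ", "¬": " not "}

def to_python(expr: str) -> str:
    return "".join(GLYPHS.get(ch, ch) for ch in expr)

@lru_cache(maxsize=256)  # hot tables stay 'at the forefront of access'
def truth_table(template: str, n_inputs: int) -> tuple:
    """full truth table for a glyph template like '{0}∧{1}' - 2**n rows"""
    rows = []
    for bits in product((False, True), repeat=n_inputs):
        expr = template.format(*(str(b) for b in bits))
        rows.append((bits, eval(to_python(expr))))
    return tuple(rows)

for bits, out in truth_table("{0}∧{1}∨¬{0}", 2):
    print(bits, "->", out)
```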
Although a great deal of careful pre-planning would be required to make the most of the advantages discussed so far, at the end of the document we get one last treat:
an ascii sketch, flowchart style, of the internal workings of a Serge VCS. admittedly, i do accept it's not written explicitly, but by this point my whole chain of thought is becoming refreshed - why not set unicode glyphs to code for entire analogue components such as operators, functions, variables, constants and whatever else?
It wouldn't just be a fun codey version of VCV Rack - this would be akin to having the PyTorch ML library, but for analog computing.
Simply string together the required functions and slopes and comparators etc. that code for the math you're solving, and what formerly took a coupla weeks of patching/punching cards/etc. now requires typing a single string of keyboard presses! cool!
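and just for kicks, here's a hand-wavy discrete-time toy wired up roughly like the VCS flowchart in the brainspew: trigger sets a latch, the latch drives a slew limiter, and a comparator on the slew output resets the latch once it tops out. all the rates, thresholds and class names are invented for illustration:
```
# trig -> s latch -> slew -> out, with a comparator feeding the reset
class Latch:
    def __init__(self):
        self.q = 0.0
    def step(self, s: float, r: float) -> float:
        if r > 0.5:
            self.q = 0.0   # reset wins
        elif s > 0.5:
            self.q = 1.0   # set
        return self.q

class Slew:
    def __init__(self, rate: float):
        self.rate, self.out = rate, 0.0
    def step(self, target: float) -> float:
        # move toward target, at most `rate` per step
        self.out += max(-self.rate, min(self.rate, target - self.out))
        return self.out

def comparator(a: float, b: float) -> float:
    return 1.0 if a >= b else 0.0

latch, slew = Latch(), Slew(rate=0.2)
trig = [1.0] + [0.0] * 14  # a single trigger at t=0
reset = 0.0
for t, s in enumerate(trig):
    q = latch.step(s, reset)
    out = slew.step(q)
    reset = comparator(out, 0.99)  # full rise -> reset the latch
    print(f"t={t:2d} latch={q:.0f} out={out:.2f}")
```
run it and one trigger produces a neat rise-then-fall envelope - the same kind of behaviour you'd patch up on the hardware.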
#####
Ofc, you're all screaming 'why aren't you doing anything with these ideas?'. welp, initially i sat down to choose a set of obscure Chinese and Japanese kanji etc. as symbols to represent operations that otherwise need a whole mess o' keystrokes to enter.
This mutated over the course of a long weekend into quite a cool lil parser that reads the input strings of symbols and encodes them as math; i then progressed to a modular library system that loads advanced calculus, statistics or matrix transform modules as required to engage the necessary set of operations.
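for flavour, here's roughly the shape that coupling took: glyphs resolving to functions, with whole families of operations living in modules that only get imported the first time one of their glyphs shows up. i've pointed the table at python stdlib modules purely so the sketch runs; the real modules were my own:
```
import importlib

# glyph -> (module, function); loaded lazily, one module per 'family'
GLYPH_FUNCS = {
    "√": ("math", "sqrt"),
    "σ": ("statistics", "stdev"),
    "Γ": ("math", "gamma"),
}
_loaded = {}

def resolve(glyph: str):
    """lazily import the module a glyph belongs to, then return its function"""
    mod_name, func_name = GLYPH_FUNCS[glyph]
    if mod_name not in _loaded:
        _loaded[mod_name] = importlib.import_module(mod_name)
    return getattr(_loaded[mod_name], func_name)

print(resolve("√")(2.0))           # 1.4142135623730951
print(resolve("σ")([1, 2, 3, 4]))  # 1.2909944487358056
```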
Naturally this mutated further and forked as it did so:
- after getting curious as to how well the library/parser coupling was working, i encoded my special woflnn setup for it and fed it in to see if it would run - and heavens-to-betsy it dang well did! first time. :O ofc i then proceeded to spend a while optimizing specifically for my nn's brain features, like the cross-talking ganglion and the cellular-automaton-based communication transfer layer, and eventually i was so deep in woflang-encoded CAs etc. that i had to really fight to get back on course.
- otoh, in parallel, i also took the whole idea back to being about efficiency of math entry/calculation, and started really working at getting the entered code to best suit the parser, such that the pairing became a whole entire new code language specifically optimized for simple/easy/minimal input effort but very fast compile/runtimes.
unfortunately, as usual, that's p much all she wrote - other than some half-assed ventures into C++ and assembly versions attempting to boost it further. alas, my short attention span struck, and the whole lot has, for the most part, languished on my git as `woflang` ever since...