
If Words were Flowers  - 051509

What is L-MEM? -- This Book is not a Garden

L-MEM is what I work on at 4:00 am on Sunday while everyone else is sleeping, every week of every month of every year, for at least 25 years now. It's a mixture of writing this book and thinking about something that the Book, the subject of this book, is my tool for trying to understand. It starts with something like this Page and acquires more and more precision over time.
 
Even if this project is not in its final form yet, I think that over time the book has become a fairly unique collection of ideas, and the printed version is a sharable form. The book is now dense and large enough to project the idea that the intellectual methods used to assemble it make sense and can produce desirable results.
 
Yet when I try to explain what this project is about, I realize that I need a good example: one that would be similar to this Book project, share none of its content, and still be conceptually related, part of the same cultural mentality. Without that I fall into what Hofstadter would describe as a strange loop.
 
So the hypothesis of this Page is that with a proper analogue we can better define what is not this Book project yet is still an L-MEM, since we are then forced to better define the "not-a". For me, an L-MEM is a set of sharable logical units from which emerge rational and aesthetically pleasing results that are larger than our unaided comprehension, unaided, that is, by enhancers such as machinery, artificial intelligence or continuous sensors.
 
L-MEM is basically a thinking method (a way to form and maintain a certain type of thinking habit) that "I", in Vol 1, try to define (with the help of humanity so far) around the theme of organizing logical paragraphs into very large books. L-MEM Vol 1 is concerned with words. Here we try to define what a flower L-MEM would be, thinking it might allow us to create a super self-reference for the Book. That is, the pleasure of a flower is so distinct from the pleasure of a word that explaining this project as a gardening problem might be a better way to explain what this Book is all about.
 
So, in this logical Page we will describe what a Flower-Based Society could be. We will use the same thought pattern we use to address the idea of authoring Books larger than our capacity to read, yet consistent and adaptive enough to remain interesting as compilations that can act as sources of never-ending pleasure.
 
If this Paragraph were a Flower…
This project focuses on defining three basic semantic units, called in the Book project the logical Paragraph, the logical Page and the logical Volume. The term "logical" is used here to remove any form of quantitative measure or actual physical contiguity from these units.
 
A logical Volume is a slice of knowledge that excludes everything else. To be clearer, "else" in this case also includes a lower stratum comprising a lexicon, general encyclopedic entries, library-like topical indexing, and common knowledge. A dictionary is made of words defined by other words (it is thus analogic), and as a whole each word in that dictionary is defined by other words in that same dictionary.
 
For example, maybe there are 1000 different flower types on planet Earth (a flower-level classification somewhat like a library indexing system; Dewey has 1000 top entries). Each of these flowers might have a distinct encyclopedic entry: a Page that describes things such as the climate where it grows, what it looks like, and when the plant produces flowers (every 7 years, every 7 weeks…). And there is a set of all flowers we wish to consider distinct (rose varieties…). An L-MEM specialized in large-scale distributed gardening would need to assume such a common knowledge base. Exclusion thus operates by not being part of that common stratum of common knowledge, and relates to what we progressively refine as a point-of-view on the docuverse (the set of all possible documents in the universe). Thus our personal large-scale gardening project assumes that something like an orchid flower is defined somewhere, and that our garden is large enough to be unique.
 
Farming Flowers as a Field of Expertise
For our purpose, an L-MEM logical Volume is pretty much the same thing as developing a field of expertise. An expert here is someone who can accomplish a task that only a few others can (say, less than 2% of the population). To acquire that expertise, such an expert probably "mastered" over 50 000 "special" concepts (the quantity is not important, but such a 'concept' is pretty much what a logical Paragraph is here: the set of words under the title "Farming Flowers as a Field of Expertise"). In practice, say you are a flower farmer: to be successful you need to know not just about flowers, but about agricultural techniques, and you need some business sense… and some of that required knowledge is local (climate, politics, cultural events that might affect the business). So a Garden is particular to one particular context. There are other flower farmers, other farmers, other businesses, etc., but there is only one Garden (with an uppercase G).
 
Over time, how you visualize your field of expertise will change. Your "special expertise" will become something else, will branch into what was once a smaller part of your personal expertise assembly. You might become less interested in the technology of fertilizers and tractors and more interested in the organization of local politics with regard to government agricultural subsidies. Your "Garden" will change in purpose and content.
 
You might also have an interest in go-kart racing, yet go-karting shares so little with flower farming that it cannot be part of the same logical Volume. So a low-level L-MEM procedure is that any new information "coming in" can first be filtered at a higher level by asking: "Is this related to this Field of Expertise/Interest?" If the core element of an L-MEM is a flower, the only way go-karts can be part of it is through an indirection, as a case study, for example "selling flowers at an event". Then, and only then, because you know all about go-karts, does it become a clear connection, since an explanation often works much better with a concrete set of organized experiences; that is, an abstract concept also needs to be explained through concrete everyday experience, both to acquire more abstract purity and to make the concept easier to define.
 
By defining your project as an expertise, when you are exposed to an idea a little light automatically pops up: "nice one!" Now, usually that idea is locked inside some other construct. To search and paste is usually not sufficient; the creative process is to contextualize a set of ideas (to transform raw data into something usable). Once extracted, to be properly integrated it needs to be put somewhere. Logical Pages are constructions that locate a number of logical Paragraphs together. Functionally, they allow logical Paragraphs to delimit themselves by acknowledging that the same thing is said elsewhere, within that defined domain, with other words. Logical Pages do overgrow in time and can then be split in two.
 
How to Grow a Garden with 1 000 000 Different Flowers
So a Book related to L-MEM Vol 1 could be "How to grow a garden with 1 000 000 different flowers". By different I mean a set of all differently named flowers. This is a substantially different project than farming over 1 000 000 units of the same flower.
 
If we were trying to make a garden of 1 000 000 different flowers, we might start to think in terms of special climates, for example a stretch in Northern Arizona where within 100 miles you traverse many different climates. You would need to develop a business model to finance and sustain the operation; perhaps it is like a botanical garden with paid admission. To think about the problem you might develop the idea of the logical vase, or plant pot. You would start to place some flower types in logical chunks of land, probably because they share a lot. You would imagine 1 000 000 people on Earth connected into a single domain, each contributing to the 1 000 000-flower garden by managing their single flower within that domain. You would soon realize that, from a garden-publishing frame of reference, it would be more optimal overall for such a Flower Society to adopt a semantic trading unit based on a logical patch of ground that collects a particular set. Society might stabilize around the idea that 1000 logical gardens, each made of 1000 logical vases, is a very useful organizational structure. Maybe one would conclude that to create the 1 000 000-flower garden we should elevate the overall result to be defined as a floral botanical park, and thus, within this context, define three basic corresponding semantic units: the Vase, the Garden and the Park.
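To make the parallel concrete, here is a minimal sketch (in Python, with invented names; the Book project itself defines no such code) of the Vase, Garden and Park units mirroring the logical Paragraph, Page and Volume:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: the garden analogues of the Book's logical units,
# expressed as nested containers. 1000 Gardens x 1000 Vases gives the
# 1 000 000 flowers, just as Volumes collect Pages which collect Paragraphs.

@dataclass
class Vase:            # analogue of a logical Paragraph
    flower: str        # one named flower (or one "special" concept)

@dataclass
class Garden:          # analogue of a logical Page
    theme: str
    vases: List[Vase] = field(default_factory=list)

@dataclass
class Park:            # analogue of a logical Volume
    title: str
    gardens: List[Garden] = field(default_factory=list)

    def flower_count(self) -> int:
        return sum(len(g.vases) for g in self.gardens)

# A Park of 1000 Gardens, each holding 1000 Vases, holds 1 000 000 flowers.
```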
 
Flower-Based Economy

Upon arriving at such a decomposition, you would then entertain distributed gardens. For example, visiting one garden a week, at the right time, for 20 years would let you look at each of the 1 000 000 flowers individually, in person. To make it possible to collect a photographic memory of that act of looking, you might develop software that lets you tag a particular flower (or several) directly when shooting, with the photo automatically uploaded to the virtual representation of your garden. You might decide to sustain a garden of 1000 different flowers, or 100 at your house with 10 others elsewhere... so part of your garden holds flowers that existed at one point in time and space, and part holds flowers that are continuously replaced. You might want to geocache the location of one of your Vase references on a virtual GUI representation of your botanical park, allowing you to drive down the street and flower-mark a site.

 

There might be a weight system giving more value to a living-flower connection based on the distance from your residence to that flower-mark, providing regional incentives for maintaining floral diversity. There might be another weight associated with how many people link to a single flower somewhere, with additional metadata such as when it was last blooming (the logical Vase, not a particular physical flower). By accumulating these grading systems we eventually form structures from which a Flower Economy can arise. By counting the number of gardeners who link to your flower to complete their botanical park (perhaps they live in the Yukon, where growing agave flowers is hard), you acquire a form of popularity measure that applies pressure on the overall system to change, and that defines a value for your effort.
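As a rough illustration only (a hypothetical sketch with invented weights and field names, not a worked-out Flower Economy), the two gradings could be combined something like this:

```python
from dataclasses import dataclass
from datetime import date
import math

# Hypothetical sketch of the two gradings described above: a proximity
# weight (nearby residents count more, rewarding local upkeep) and a
# popularity weight (how many distant gardeners link to this Vase),
# dampened by how recently the flower was last seen blooming.

@dataclass
class FlowerMark:
    distance_km: float        # gardener's residence to the flower-mark
    incoming_links: int       # gardeners who link to this Vase
    last_bloom: date

def flower_value(mark: FlowerMark, today: date) -> float:
    proximity = 1.0 / (1.0 + mark.distance_km)        # nearby care weighs more
    popularity = math.log1p(mark.incoming_links)       # diminishing returns on links
    staleness_days = (today - mark.last_bloom).days
    freshness = math.exp(-staleness_days / 365.0)      # recent blooms count more
    return (proximity + popularity) * freshness

print(flower_value(FlowerMark(3.2, 40, date(2009, 4, 1)), date(2009, 5, 15)))
```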

 
Try to Write in a Manner which is Future-Proof
Our project is more about paragraphs than flowers, about writing rather than gardening. L-MEM Vol 13 might be based on flower-based units: instead of print this Book, maybe it's plant this Garden. Style-wise, aside from having clear filtering methods to exclude what is not related, there is also a continuous effort to ignore current techno-buzzwords and to avoid naming things by brands. One action taken in the last two versions is to increasingly start with text samples from older texts playing with a similar or related idea. Of course this process is greatly augmented by the availability of digitized archives.
 
I noticed that, aside from correcting my poor earlier English writing, a lot of the editing of my older text consists of either removing references of the day (e.g. the name of a particular piece of software) or tagging the example with an approximate date stamp. To help me write like that, I try more and more to define the ideal reader as someone who will live 6000 years from now, and to help that proto-reader understand and interpret me I also try to define my location in time within a historical continuum. If this were about gardening, Gregor Mendel's genetics would be part of the ecosystem of ideas my Gardening project relates to, something it would try to include without fully including Mendel's whole body of work. It would not simply cut and paste Mendel's work; it would include it perhaps to help define the Garden theory, which forms of hybrids make sense. Perhaps we would engineer apple trees with different branches to create, at once, with a single tree, 10 different "flowers".
 
Avoiding Over-Connectivity

Over-connected systems are fragile. The most successful knowledge systems work by exclusion, and exclusion is multi-dimensional, including evolutionary. Common wisdom would say that partial knowledge is dangerous. Once we assume that we have access to all texts ever written in digital form, external references often act as exclusion rather than validation. This is related to that by this, but "that" is not a subset of "This". For example, I look at the list of books that "I Am a Strange Loop" by Douglas Hofstadter names in its introduction as influences, and I notice some overlap with my own list of what has influenced "This" book. Somehow, where I come from is turning analysis into synthesis. The presence of such clearly overlapping fields of semantic influence tells us there is potential for deeper interpolation between their respective points-of-view on the docuverse. In very abstract terms, this means that if you took all of Hofstadter's writings and reformatted and bound them into one volume called L-MEM Vol 5: Self-Referential Systems, what sort of analysis would have generative (creative) value? On a low level, it is perhaps easier to conceive of using lexical analysis to morph, if you like, an in-between lexicon, a sentence-construction style that ends up, style-wise, part of each. This is similar to asking an author to copy-edit another author and then to rewrite themself using only the words used in the other Book.
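As a loose computational reading of "overlapping influence fields" (a hypothetical measure, nothing this project actually runs), one could compare the two influence lists, or the two lexicons, as sets:

```python
# Hypothetical sketch: measure how much two Books' influence lists
# (or their lexicons) overlap, as a crude signal of how much
# interpolation between their points-of-view might be worthwhile.

def overlap_ratio(influences_a: set, influences_b: set) -> float:
    """Jaccard similarity: shared items over all distinct items."""
    if not influences_a and not influences_b:
        return 0.0
    return len(influences_a & influences_b) / len(influences_a | influences_b)

# Placeholder titles only; the real lists are the ones discussed above.
this_book = {"title A", "title B", "title C"}
strange_loop_list = {"title B", "title C", "title D"}
print(overlap_ratio(this_book, strange_loop_list))  # 0.5
```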

 

Book Morphing

First, let's define with an analogy exactly what we are trying to define here. You could have a visual effect that transforms one face into another: morphing. At 50% you get halfway between one and the other. You can also have other types of transformation: you can visualize yourself after, say, surgery, a different kind of morphing. You can try to visualize what a baby would look like given pictures of the two parents. You can also try to average a set of faces to create an average person.

 

In all cases, to come back to writing, this could first be defined at the lexicon level. A particular Book does not use all the words in the common lexicon, yet it might make up some of its own. You have played with a word processor that asks you, during a spell check, whether you want to add an unknown word to your lexicon. So the first thing a Book has is a lexicon of all the words it uses. Some of these words are basic words: to, in, all, your, I… One such collection is Ogden's Basic English, a set of 850 words sometimes used as the starting vocabulary when learning a foreign language. Over such basic English there is another set of words that are not as commonly used; in this Page these could be lexicon, construction, archives… as well as some proper names such as Mendel or Hofstadter, which refer to a particular person rather than everyone with that name. This also implies a particular semantic exclusion in the way certain words are used: for example, "construction" here is certainly not used in the sense of "the occupation or industry of building". So a Page could be lexically analyzed so that certain words are tagged as basic and the rest become a list tied to a particular dictionary, which would list 1 to N definitions, with some checked off. If you then wanted to "morph" that Page so it fits another Book's Page, the first-degree problem is that you have two lexicons that do not overlap. One can imagine using color coding to identify words without a lexicon match. Similarly, one can measure different syntactical constructs, such as the average number of words per sentence or popular sentence constructions, and go further to generate something that resembles someone's writing style.
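A minimal sketch of that kind of lexical tagging (Python, with a tiny stand-in for Ogden's 850 words; purely illustrative, not a tool from this project):

```python
import re

# Tag a Page's words as "basic" or "specialized" against a small stand-in
# for Ogden's Basic English, and measure a simple style metric
# (average sentence length in words).

BASIC_ENGLISH = {"to", "in", "all", "your", "i", "the", "a", "of", "and",
                 "is", "that", "you", "it", "be", "with"}  # stand-in for the 850 words

def page_lexicon(text: str):
    words = re.findall(r"[a-zA-Z']+", text.lower())
    basic = {w for w in words if w in BASIC_ENGLISH}
    specialized = {w for w in words if w not in BASIC_ENGLISH}
    return basic, specialized

def avg_sentence_length(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return sum(len(s.split()) for s in sentences) / max(len(sentences), 1)

page = "The lexicon of a Book excludes construction jargon. Hofstadter is a proper name."
basic, specialized = page_lexicon(page)
print(sorted(specialized))        # words that need a dictionary (or color coding)
print(avg_sentence_length(page))  # a crude style measure
```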

 

At the vocabulary level, the exercise then resembles, from a Book standpoint, rewriting the other Book's Page using only the words from our lexicon (so it is a form of degenerate translation). So we probably have the following cases (sketched as set operations after the list):

1) Morph someone else's Page into our Book (use our lexicon to "fix" the Page so it fits)

2) Morph our Page into someone else's lexicon (fit our Page into someone else's world)

3) In-between a Page from both Books using only the shared subset or the combined superset of their respective lexicons

- perhaps to create a Page that can fit in both Books

4) Create a Page outside of both Books (a different POV) using exclusively another lexicon
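
The lexicon arithmetic behind these four cases can be sketched with plain set operations (a hypothetical helper, not a tool from this project):

```python
def allowed_words(ours: set, theirs: set, other: set, case: int) -> set:
    """Which lexicon the morphed Page may draw from, per cases 1-4 above."""
    if case == 1:                 # their Page rewritten with our words
        return ours
    if case == 2:                 # our Page rewritten with their words
        return theirs
    if case == 3:                 # an in-between Page that could fit in both Books
        return ours & theirs      # shared subset (or ours | theirs for the superset)
    if case == 4:                 # a Page outside both Books, a different POV
        return other - (ours | theirs)
    raise ValueError("case must be 1-4")

ours = {"garden", "vase", "paragraph", "page", "volume"}
theirs = {"loop", "self", "reference", "page", "volume"}
print(allowed_words(ours, theirs, {"petal", "loop", "soil"}, 3))  # the shared words
```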

 

Of course, this level only focuses on "texture" morphing; it completely ignores what is actually being said. One text could argue against abortion and the other for it. Even if a common lexicon is derived and the text morphed, it does not mean the actual opinions are thereby reconciled.