Technology related projects

Techdemo (2012), 3D printing (2012), Graph project (2011), Structure from Motion (2009, 2007), Geodesics (2009, 2008), Flash PITF (2004), Sofia AI project (2003/2004), CCP I, II (2001)

Technology notes 2017-2018, 2007-2010, 2011-2015, 2016, Now

20-08-2018: A small change in Perl regex handling. I usually run a script to generate from a bibtex version of a file an independent file (for example to submit to the ArXiv, which needs the bibliography included rather than generated with bibtex). The script replaces the bibtex entry with the content of a file. Until recently the following would work (from the command line or within another script): perl -pe 's/\\bibliography{geometry}/`cat file.bbl`/ge' -i file.tex This no longer works; the curly brackets need to be escaped. The error message is "Unescaped left brace in regex is illegal here in regex"; the working version is perl -pe 's/\\bibliography\{geometry\}/`cat file.bbl`/ge' -i file.tex I love established and entrenched programming languages like Perl which do not mess with the user, as even small changes break a lot of stuff. This is a rare case in Perl where the developers badly mess with the users. Basic stuff like how regexes are handled in scripts should never be changed. If one googles the error message, one sees that this breaks a lot of other places. If the developers think something is not handled well, it should be implemented by allowing the user to set a flag enabling the new feature. Now, in my case, this is not that bad: I have to adapt that particular script in only 41 files, but I still don't know what other automated stuff will fail now.
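The same substitution can also be done without worrying about regex dialect changes, for example with a short Python sketch (the file names are the ones from the example above; the function name is my own):

```python
# Inline a precompiled .bbl bibliography into a LaTeX file, mirroring the
# perl one-liner above. The braces are escaped in the pattern, as newer
# perl versions now also require.
import re

def inline_bibliography(tex_path, bbl_path):
    with open(bbl_path) as f:
        bbl = f.read()
    with open(tex_path) as f:
        tex = f.read()
    # A callable replacement avoids backslashes in bbl being interpreted
    # as group references by re.sub.
    new_tex = re.sub(r'\\bibliography\{geometry\}', lambda m: bbl, tex)
    with open(tex_path, 'w') as f:
        f.write(new_tex)
```

Unlike the in-place perl edit, this reads and rewrites the file explicitly, which makes it easy to test before overwriting anything.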
06-08-2018: An article in Aeon by Nicholas Tampio provides some critique of "screen based learning". I think there is a point. Real experiences with actual physical objects can give more insight. I myself believe to have gotten a lot more intuition about numbers by playing with Cuisenaire material (also in number theory) or with Lego, Meccano, electronic boards, chemistry kits, or by building stuff. Books of course were also important. There was no web when I was a kid, but TV instruction then was pedagogically essentially the same as MOOCs today, just that you did the homework on paper rather than on the screen, and that the lectures were broadcast at a specific time (which prevented procrastination). The article mentions the book "Phenomenology of Perception" by the French philosopher Maurice Merleau-Ponty, who stressed the lived experience, the "doing" over the "seeing". I myself find such discussions always a bit tiresome when done in the abstract. It is difficult to theorize about pedagogy and learning; the real experience can be different. And one of the issues raised in that Aeon article is that real experiences, real people, real teachers, real engagement only come with real interactions. This happens less when looking at a screen. Somehow, the screen can distract. I just had a day and a half without internet and phone at home because of some networking problem of our internet provider (at one point I was an hour on phone support). The issue got resolved. But one thing was good: I was forced to focus on local things for once, used the blackboard for thinking and got an idea for a new proof of something. The topic of "screen based learning" might be complex, but as a general rule, keeping a healthy balance can be a good idea.
05-07-2018: Something about exploring creativity through computer algebra on Medium. It is also mentioned now on my creativity page. Related to computer algebra: checking the Hamiltonian property of graphs is hard. This can be illustrated by checking the Hamiltonian property of the graph with vertices {1,2,...,n}, where two nodes are connected if their sum is a square. So far, the machine gets stuck when checking n=152.
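For small n, the property can be checked by a straightforward backtracking search for a Hamiltonian path in that "square sum" graph. A sketch (the function name is my own) which reproduces the small cases of the puzzle, where a path exists for n=15 but not for n=18:

```python
# Backtracking search for a Hamiltonian path in the square-sum graph:
# vertices 1..n, with an edge between two numbers whose sum is a perfect
# square. Brute force, so only practical for small n.
from math import isqrt

def square_sum_path(n):
    squares = {k * k for k in range(2, 2 * n)}  # possible sums are <= 2n-1
    adj = {v: [w for w in range(1, n + 1) if w != v and v + w in squares]
           for v in range(1, n + 1)}

    def extend(path, used):
        if len(path) == n:
            return path
        for w in adj[path[-1]]:
            if w not in used:
                result = extend(path + [w], used | {w})
                if result:
                    return result
        return None

    for start in range(1, n + 1):
        result = extend([start], {start})
        if result:
            return result
    return None
```

The worst case of such a search is exponential, which is exactly the point of the entry: there is no known shortcut for Hamiltonicity in general.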
04-07-2018: Google Chrome will soon label websites unsafe if they are not HTTPS (see the Register article). Has the company "Alphabet" become so unsafe? This insanity needs to stop. One can understand the concerns for HTTPS, but not every website is a service. Many older pages are just information pages without login features or transactions. There is no need to upgrade them; these older pages are usually the more reliable sources. Upgrading to HTTPS is not such a small thing, as often many links are hardwired in. Certificates can be expensive and are another potential place where another service can screw up, as it is possible that the certificate does not get renewed. For a typical website, there are already the service provider and the domain name registration, which are different entities. Unless one wants to pay an additional fee, which can become expensive with many pages, we now need a third party for the certificate. Then there is the problem with internal links which will no longer work after an upgrade (I had seen that with a WordPress upgrade, where it is relatively straightforward, but still all image links needed to be adapted). Google Alphabet has definitely turned unsafe!
Plato's writing is unsafe due to a fucked-up alphabet
15-05-2018: It is the time of the year to update my machines a bit. I upgraded two of my office machines from Ubuntu 16.04 to 17.10, and then to 18.04, both times with "update-manager -c". Went very well. The only snag was that from 17.10 to 18.04, the lightdm manager got lost. I'm still using the blackbox window manager. Canonical announced replacing the X server with Wayland in version 20, but I guess it will then also be no problem to keep the X server running. Ubuntu 16.04 had been a very stable system. [Update 05/16/18: made also a home machine update from 16.04 to 18.04 in 3 steps. Needed a night and getting up a couple of times, because the upgrader asks questions in the middle about configuration files. But things went well. Just needed to add the Nvidia graphics drivers with "sudo ubuntu-drivers autoinstall". A glitch with Perl killed a morning: I needed to reinstall Perl by running "upgrade" after "sudo cpan" and still have some library problems in Perl. How come there is suddenly a "perl5" directory at home? I did not ask for it. It might be related to the Perl problems which came from a rather steep upgrade from 16.04 to 18.04. I needed to set the Perl environment variables by hand but still have libgtk3 problems. I might still have to do a fresh install of 18.04 after all, if there are more issues.]
30-04-2018: From a Mathematica workshop: Writing music with Mathematica.
24-03-2018: The New York Times writes about another push of the Justice Department to mandate a way to unlock phones. It's good that besides some suits (Ray Ozzie and Ernie Brickell), also an expert (Stefan Savage) is on board. In technology, and cryptology in particular, it is always difficult to say what is possible and what is not. But in this case, it is not so much a technical issue. Whenever a second party (whether manufacturer, government or even some independent entity) can open a device, then third parties can do it too. This claim can not be proven in general; it has just happened so many times historically. From the article: The idea is that when devices encrypt themselves, they would generate a special access key that could unlock their data without the owner's pass code. This electronic key would be stored on the device itself, inside part of its hard drive that would be separately encrypted - so that only the manufacturer, in response to a court order, could open it. I don't know, but this looks like a very bad idea: if the system writes that access key into the memory, this part is accessible physically. Whoever thought about this probably assumes that there will be some kind of decryption method known only to the FBI which allows decrypting the device. So, what do you do then, if for some reason a third party knows how to do that? Suddenly all phones, including the ones of the folks who came up with the idea, will be wide open to everybody. How long until nobody will buy phones from US manufacturers any more?
18-03-2018: USB C multiplier hubs are almost nonexistent. I mean hubs which feature additional USB C ports, not just the old USB A ports. The ones one can buy are expensive, like this one. Virtually none of the cheaper hubs features additional USB C ports. A good hub would make the Macbook more usable. Or then bring out the next Macbook with two USB C ports. The engineers seem not to get it: if the laptop is at the charger, one still needs another USB C port for adding a USB C drive or device (USB C is faster; it will pick up anyway). The worst are the USB C hubs which don't even feature an additional USB C charging entry, as this means the hub can not be used for stationary use. Are there other reasons for this disaster? One could imagine expensive patents on USB C specifications, or technical problems with powering the ports. I myself have a NIGI adapter which fortunately features one additional USB C port. But the plug has already worn out because I use it so often: at home I attach an external USB C hard drive for backup.
14-03-2018: To honor Pi Day, I filmed a short story in a computer game. The entire project is quite time consuming; it took in total about a day of work (filming and building a story). Assassin's Creed is well suited, as the character can freely roam around. Unfortunately, one cannot program the AI characters like the characters in the library or on the street, but it is possible to tell a story. There is quite a bit of mathematics already built into the extension pack.
08-03-2018: The Register honors the ZX81. As usual, the Register title is hilarious: "10 PRINT "ZX81 at 37" 20 GOTO 10". But there is something nostalgic coming in. At that time, computers were REALLY exciting. Unlike today, where we just have incremental power increases, these were breakthroughs. I myself did not have the ZX81 but a Tandy imitation. But what was nice: you started up the machine and were right in a programming language. Yes, it was only BASIC, but you felt in control. It was a step forward from TI-57 programming, where one had to fight for each line of code in order to fit into the memory.
17-02-2018: Small is beautiful. The computation of all cohomology groups of an arbitrary simplicial complex in 6 lines. This compares with computations done for a triangle illustrated in Math E320 (see the [4 miracles] Mathematica notebook).
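The notebook is in Mathematica; as an illustration of the same linear algebra, here is a sketch in Python (names and input format are my own choices) computing Betti numbers, which over a field agree for homology and cohomology, from the ranks of the boundary matrices:

```python
# Betti numbers of a finite abstract simplicial complex, given by its
# facets, via b_k = dim C_k - rank(d_k) - rank(d_{k+1}).
# Exact rational Gaussian elimination, no external libraries.
from fractions import Fraction
from itertools import combinations

def rank(M):
    # Row-reduce over the rationals and count pivots.
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0]) if M else 0):
        piv = next((i for i in range(r, len(M)) if M[i][c]), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c]:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def betti(facets):
    # Close the facet list under taking subsets.
    simplices = sorted({s for f in facets for k in range(1, len(f) + 1)
                        for s in combinations(sorted(f), k)})
    by_dim = {}
    for s in simplices:
        by_dim.setdefault(len(s) - 1, []).append(s)
    top = max(by_dim)

    def boundary(d):
        # Matrix of the boundary map d_d : C_d -> C_{d-1}.
        if d <= 0 or d > top:
            return []
        rows, cols = by_dim[d - 1], by_dim[d]
        idx = {s: i for i, s in enumerate(rows)}
        M = [[0] * len(cols) for _ in rows]
        for j, s in enumerate(cols):
            for i in range(len(s)):
                face = s[:i] + s[i + 1:]
                M[idx[face]][j] = (-1) ** i
        return M

    return [len(by_dim[k]) - rank(boundary(k)) - rank(boundary(k + 1))
            for k in range(top + 1)]
```

For example, the hollow triangle (a circle) gives Betti numbers [1, 1], while the filled triangle gives [1, 0, 0].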
04-02-2018: Just asked Siri who won the Super Bowl. Siri gave last year's results. Not watching TV myself, I had assumed that the game had already taken place and that Siri is just clueless. I thought I had closed the conversation and exclaimed: "F..., you don't know". Siri answered: "Sorry, Oliver, I try to be better next time". After I found out that the game had not taken place yet, I realized I had wronged the machine. The funny thing is that I really felt bad having shouted at it. Last week, a student suggested I ask the machine whether it wants to marry me. Siri: "Oliver, we don't know each other long enough." Good answer. A better one would have been: "But you are already married, Oliver".
08-01-2018: The Meltdown and Spectre CPU disaster (a "Supergau") produced a lot of discussion. This could be an opportunity for open source processors. I bounced this once at a Heise forum. This is nothing new, but the task would be formidable to get even close to commercial processors in performance. It is a kind of sad modern development in technology (also in Artificial Intelligence) that the development resources are so large that it has become a part of industry and is not part of academia any more. Now we see one of the consequences of having these technologies done in opaque frameworks which nobody can examine, or then only when it is too late. Not that open source is immune to gaffes, but the risks are lower as more eyes look at things. Now we are fu..ed. There is no other word. I updated my Linux machines and already see a performance drop, and the machines of course are still not secure. In the end, I don't think it will be too bad for the chip industry. They will be able to sell new processors like crazy once the new processors are out. Yes, there will be lawsuits, but essentially everybody will need new processors without speculative execution, or with it implemented so that it can be disabled.
29-12-2017: An important article pinpointing one of the key issues in computer-human interactions: latency. It drives the user insane. We have computers 1000 times faster than the very early ones, but latency is worse, even if it is just the very tiny delays when typing. It is one of the main reasons why minimal window managers and Linux are great: on Linux, one can control any process which causes delays.
23-12-2017: This is the time to try technology. I was reluctant so far with buying an iWatch. Gave it a try. It is quite good. I like Gaia, the GPS hiking app. I have been experimenting with small GPS devices for running in the past, and the GPS did not always connect reliably. This is also the case for the iWatch, which still needs to have the phone present. The greatest problem with the iWatch so far is the inability to close some apps. Some don't want to quit. The Apple outdoor app for example can not even be killed by rebooting the watch. Also Sudoku and 2048 (which are both nicely adapted to the watch) can not be reset; I had to remove them. I still don't know how to disable the workout app.
23-12-2017: I'm disappointed however that Apple changed my iPhone setting to the HEIC format. Now I have to convert hundreds of pictures to JPG. It is possible to set the camera preferences to JPG (compatible format), but a recent iOS upgrade must have automatically pushed HEIC. It will probably take some time until ImageMagick will convert things; for the last few hundred pictures done in HEIC I use for now the free "iMazing HEIC converter", which works quite well. Still, it would have been a nice courtesy of Apple to inform the user that they get pictures which can not be opened by most applications.
22-12-2017: I like Firefox and support it, as I don't want the browser market to be dominated by a single company. The war on non-HTTPS URLs is annoying however. With Firefox I can no longer log in to my router (which is of course not HTTPS, as it is on a LAN). Now, if I access the page, a warning dialog appears which actually prevents entering the password. And then, since clicking away the dialog counts as a login attempt, the router shuts down. Also, changing to HTTPS is not as trivial as it might appear. I have recently changed a tiny WordPress blog, "quantum calculus" (60 pages only), to HTTPS, but since every linked image has a http: URL, I had to change this by hand and still missed some. In general, it is not the business of a browser to educate the user. The next step will be a warning message every time a non-HTTPS link is accessed. Get a grip, Firefox, not everything needs to be encrypted. If they don't stop this, I will delete you for good and compile my own.
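The chore of fixing hard-wired http: links when moving a site to HTTPS can at least be scripted. A minimal sketch (domain and function name are placeholders) which rewrites only links pointing to the site's own domain and leaves external http links alone:

```python
# Rewrite hard-wired http: links to a given domain as https: links,
# leaving links to other hosts untouched.
import re

def upgrade_links(html, domain="example.com"):
    # Only touch URLs on the site's own domain.
    return re.sub(r'http://(%s)' % re.escape(domain), r'https://\1', html)
```

In practice one would run this over all posts of the blog and then grep for any remaining "http://" occurrences, since some links (like the router on the LAN) should stay as they are.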
21-12-2017: Something posted to this article. Why do we need a machine which does it all? I need a workstation which is always on, does its daily chores, where I have a gorgeous large screen which can also be used to watch a movie. I need a laptop to write on, with a clear screen without greasy fingerprints. I like to touch the screen when consuming content on a tablet, but when working, fingers are forbidden on the screen. I need a tablet to read larger books, and I need a phone to quickly look things up or to communicate. The phone needs to be small so that I can always have it with me, the tablet large so that also diagrams can be read well. Combining things always means compromise: we have had printer-scanner combinations; they failed at both tasks. It is better to have a cheap, small, reliable laser printer and a good reliable scanner which knows its stuff, can OCR, and is fast and reliable. The price of a device can be justified if it saves time: saving half an hour a day means 150 hours a year. Time is money too. For a laptop especially, I need reliability, a strong keyboard, a good screen. A 1TB drive would be nice too. But because there is lots of wear and tear, I personally prefer to buy relatively cheap but still good laptops and replace them frequently, delegating tasks which need heavy CPU or large amounts of disk space to the workstation.
18-12-2017: Perl is 30 years old. A Heise article laments about the enthusiasm for the language. I agree: Perl is a great language. One of its strengths is that it is stable. Happy birthday. Like a swiss army knife, it is a powerful tool and amplifies the shell. It is maybe no longer fancy, but it does not matter what the masses think. A language which is 30 years old and still going strong deserves respect and also investment. At least it is safe from "renovations" which destroy old programs. I can count on scripts I wrote 10 years ago still running in 10 years. I maintain a few "old fashioned websites" like rhetorik.ch or rheinfall.com, blogs like this one on graph geometry, course websites like This, and then larger ones. All written in Perl, some of the code almost 20 years old. But for decades I have not had to spend ANY time on system administration. It runs by itself. I can focus on the content and the math rather than having to maintain stuff. Yes, content management systems, especially written in PHP, have taken over. Even so (I also maintain such a WordPress blog, quantum calculus), there are disadvantages of a CMS: pages written in Perl are "documents"; they are static, periodically generated pages which can be ported anywhere, independent of technology. They are documents which can be referred to. They are also FAST, very FAST. And stable and less vulnerable to attacks.
02-12-2017: A wise recommendation: keep paper backups for voting.
23-11-2017: A good article on the Intel Management Engine disaster. Some background about this troubling technology is given here. The talk of Joanna Rutkowska is here. It starts with "Personal computers are extensions of our brain. They are insecure and untrustworthy". An example of a well given presentation. We usually have assumed that hardware is trustworthy. One of her conclusions: today we can not assure secure boot. Rutkowska says: "ME is an ideal backdoor and rootkiting infrastructure". It is part of the "zombification" of computing: the hardware contains operating systems which nobody can look at and which nobody can disable. Not even a secure OS like Qubes can prevent ME from taking over.
21-11-2017: We are on the brink of a most terrible technology decision: the repeal of net neutrality. A NYT article puts it well: the internet might become a "pay per view" technology, at least in the US. Why a single person like the boss of the FCC (a proven lobbyist of the telecoms) can make such a decision on his own is totally beyond me. It might lead to a much weaker US economy in the long term. There were other attempts at bad decisions recently, like health care changes which border at making it appropriate to call the lawmakers terrorists, as it would have terrorized a large part of the population (the definition is "the use of violence in the pursuit of political aims, religious or ideological change"). About 25 million would have lost insurance, meaning the death of tens of thousands of Americans, definitely much more than 9/11. (John McCain with his famous midnight "thumbs down" vote probably saved more lives than any general in the history of mankind.) As taking away health care obviously kills people, it is an act of violence. It is not stabbing somebody to death; it is just watching the person bleed to death without doing anything, and that is violence too. Changing net neutrality will not kill people, it will kill businesses. Maybe not worldwide, as other countries are not that stupid. One can just say, it is not only idiotic, it is also deeply unpatriotic. Here is an article in "Entrepreneur" explaining a bit the small business aspect.
17-11-2017: Just got one of these 5TB USB 3.0 hard drives for backup. 140 dollars. This is great for long term backups which are not overwritten. Having a growing digital library to back up requires larger capacities, and 5 TB currently is enough for a full backup. Why is it important to have a local electronic library (books, music or movies)? First of all, the streamed content is changing. Netflix might offer a movie now but no longer in a year. Streamed music changes. You might hear a song now; in the future, the song has changed, been modified to a "more modern taste". This could just be the beginning. It could be, for example, that some movies are changed or modified, maybe because a scene has become too offensive, maybe because an actor is no longer wanted to be associated with the movie, and that part is cut out. There are typically many versions of a movie available, extended versions, director's cut versions etc. You might want to hold on to a version, as in the future, with a streaming service, that version might no longer be available, or might be modified. In a dystopian future, we can imagine that electronic books are censored and changed. We are already in a time where, if reading books on electronic devices controlled by third parties, all of your reading is recorded, registered and used statistically (how long did you read which page, how much of a book did you read, where did you read it etc). Files can be modified, changed, censored, cleaned from possibly politically incorrect parts (taste changes with time) or offensive parts or parts with critique of a regime. We are already there. There are "clean versions" of movies available, where for example any violence is gone, where bed scenes are gone, where inappropriate language is cut out or "beeped out". Some documents might be deleted due to some legal or other quarrels.
If you look at history or other parts of the world, there are many instances where access to information has been disabled, maybe for ideological reasons. It happened in the past that documents bought on Kindle were "repossessed". Music, text or movie documents could in the future, without you knowing it, be modified, watermarked or cleaned out. This already seems to happen with documents submitted to the "cloud": as the cloud provider does not want to store too many identical files, it might "replace" your file with another "identical" one. Do we get back our file, or maybe a new, different version? Some documents might in the future just disappear. Decades ago, one had to burn books in order to ban them. Now it is much easier and more subtle: just modify the book and keep it available. A user bound to a dumb device like an iPad or Kindle might not notice. If you own the file, you can OCR it and compare text or sound or video. The technology to modify documents, pictures, even movies has improved dramatically. So, thank you very much, but I keep buying my media (books, movies, or music) and back them up. A look at history shows that blind trust is not always best.
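One way to notice silent modifications in a local backup is to keep a checksum manifest next to it: record SHA-256 digests once, and verify later that no file has changed. A minimal sketch (paths, names and manifest format are illustrative):

```python
# Integrity manifest for a local media backup: map each file (relative
# to the backup root) to its SHA-256 digest, then verify later.
import hashlib
import os

def sha256(path):
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(1 << 20), b''):
            h.update(chunk)
    return h.hexdigest()

def make_manifest(root):
    return {os.path.relpath(os.path.join(d, name), root):
            sha256(os.path.join(d, name))
            for d, _, files in os.walk(root) for name in files}

def verify(root, manifest):
    # Return the files that are missing or whose content has changed.
    return [p for p, digest in manifest.items()
            if not os.path.exists(os.path.join(root, p))
            or sha256(os.path.join(root, p)) != digest]
```

Storing the manifest on a different medium than the backup itself makes tampering with both at once harder.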
12-11-2017: A nice article about augmented reality ("data-vomit gush"). There is especially a link to a movie showing the first experiments with virtual reality by Ivan Sutherland from 1968. It is related to the MIT Lincoln Labs in Lexington. Sutherland is also known for Sketchpad. Currently, the European tech sites like the Reg or Heise kick ass. Heise just now has a nice article about how Face ID was cracked with a mask. By the way, when looking back at these historical videos, it becomes evident how far ahead universities (like MIT) were at that time! Now, cutting edge technology is outsourced to companies. This happens at an amazing speed also in higher education. They don't even try to fight. It makes of course sense financially to outsource IT, to outsource mail, to outsource course website technology, even to outsource teaching. But it will soon mean the end of a golden age of "higher education" as a place where innovation happens. Impossible? We have seen it happen in the automotive industry. It is not inconceivable that in 50 years, Boston is the new Detroit. If this looks ridiculous, just look at how far ahead the Lincoln Labs were in 1968. Companies like Microsoft (1975) or Apple (1976) were not even conceived then. P.S. There had been previous times where industries were ahead of the game. IBM, Xerox or Bell Labs come to mind, so it is maybe not such a new thing. It is just the scale which is much different.
11-11-2017: Installed High Sierra on one of the Macs. I think the system is now faster. No problems so far. Actually quite amazing, as so much has changed under the hood. As I had performance issues with Keynote on my laptop, I also upgraded the laptop. Maybe it can now run Zoom and Keynote at the same time.
11-11-2017: A slightly modified comment posted on this story: What I want from a programming language are Standard, Stability and Speed. Nobody minds the little quirks, redundancies or the lack of elegance. When I program something today, I want it to run in 10 years, without modifications! In particular, I want the language to still be around, and the grammar, once set, to stay a standard. I want the program to run stably. In particular, I expect developers to be VERY VERY careful when changing the compiler. Even small changes annoy. C has been quite good, but recently it was no longer possible to run gcc -lm example.c ; linking the math library required gcc example.c -lm . WTF. One has to change 700 Makefiles now just because somebody thought this is more elegant? I don't mind if a language is extended or sped up, but don't change old grammar, not even the smallest things. There is a lot of code around which would need to be fixed. I'm in particular cautious when adopting a new language, even if it is only a wrapper. They first hype and spike. In the worst case, the developer gets over excited and changes the language again and again. In the second worst case, the language gets abandoned. A language needs to earn respect, prove that it is stable over a long period of time, that it is reliable and fast.
06-11-2017: My Zoom setup for teaching Math E 320: a picture from Monday, November 6, 2017. I had problems running both Zoom and Keynote on the same laptop. I currently feed the slides from a second laptop which joins the meeting too. There is a large monitor attached which makes things more comfortable. Click on the picture below to see it large (10 MB file).
24-10-2017: Some extended comment to this Register article: The analogy with a utility is deeply flawed. Information is not a utility. It can be (1) sensitive and (2) crucial, it (3) requires big pipe capacities and (4) requires a healthy IT culture to be handled properly. We started as clueless kids playing on mainframes, asking "mommy" (the sysadmin) for computing time, then became autonomous and educated, and now return to the nursing home, paying the nurse (the cloud provider) for every second of service (computing time).
  1. Information is not a utility. Water, gas or electricity do not contain possibly sensitive information which needs to be protected. If a utility provider goes down, it is bad but not deadly. Losing data in a "cloud", or having data diffuse away to a third party, can kill a business, as leaked information remains leaked forever. If one of the major cloud providers loses control, it could even lead to a recession, as many businesses would fail. Water, gas and electricity are information-free quantities; data files are not, they can be personal and crucial for a business.
  2. Information technology is vital. A power station going down or a water pipe under repair is a temporary inconvenience. Data loss or a data leak is irreparable and would be especially bad for the financial, health and educational sectors. As a private person, I can survive for weeks without internet, electricity, gas, even water, and still keep up essentially the same productivity. A modern laptop can be powered by solar, it is possible to work even in candle light, and water can be bought in bottles. Such resilience is not possible with cloud IT.
  3. Information pipes are way too narrow. A big problem with delegating IT to third parties is the internet infrastructure. Especially in the US, it is weak and expensive. The last mile is the main sore point. For any utility like water, gas or electricity, capacity is not a problem. Now, with net neutrality currently dying in the US, it will become even worse. We will have to pay more, maybe even more for backing up large amounts of data on a foreign data server.
  4. Lack of a healthy IT culture. A consequence of delegating things elsewhere is a loss of IT culture. In the short term, it can make sense, as for now the cloud suckers dump prices to keep people hooked and destroy local IT infrastructures. Once that is dead, it is difficult to build it up again, and higher prices are likely to follow. Yes, it is good that we don't have to uudecode an attachment by hand any more, that most computers now need almost zero maintenance, that backups can be automated onto a time machine etc. But it also means for many institutions that the IT culture is hollowed out.
22-10-2017: The exhibit Can you hear the sound of a simplicial complex uses MP3 files triggered by mouse click. I first used "onmouseover", but sound is in general annoying in web content when appearing unexpectedly. Most of this page was generated fairly automatically. The eigenvalues of the matrices correspond to the sound frequencies. Mathematica generates the sound and image files.
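As a toy version of such a sonification, one can take the Laplacian eigenvalues of a cycle graph, which are known in closed form, map them into an audible range and write a short WAV chord. The mapping from eigenvalue to Hz below is an arbitrary illustrative choice, and only the standard library is used:

```python
# Sonify the Kirchhoff Laplacian spectrum of a cycle graph C_n.
# The eigenvalues are 2 - 2 cos(2 pi k / n) for k = 0, ..., n-1.
import math
import struct
import wave

def cycle_spectrum(n):
    return [2 - 2 * math.cos(2 * math.pi * k / n) for k in range(n)]

def sonify(eigenvalues, path, base=220.0, seconds=2.0, rate=8000):
    # Map eigenvalue ev to the frequency base*(1+ev); skip the zero mode.
    freqs = [base * (1 + ev) for ev in eigenvalues if ev > 1e-9]
    frames = bytearray()
    for t in range(int(seconds * rate)):
        s = sum(math.sin(2 * math.pi * f * t / rate) for f in freqs)
        # Normalize so 16-bit samples cannot clip.
        frames += struct.pack('<h', int(30000 * s / max(len(freqs), 1)))
    with wave.open(path, 'wb') as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))
```

For the exhibit, the frequencies come from the eigenvalues of the connection matrices of the complexes instead, but the mapping idea is the same.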
17-10-2017: The limitation of Twitter to 140 characters is a standard which should not be given up lightly. We have a new unit, "the tweet". If Twitter changes it to 280 characters, it should be called differently, like a "roar". Limitation is an interesting challenge, especially in code. Sometimes one has to fight a bit, like in this post on the energy theorem: I had to leave out the semicolons after the definition of the connection matrix and the definition of the energy, but I wanted to cover the complex given at the beginning of the talk about this energy theorem. I think Twitter would make a "cultural" mistake, as 140 characters has become a "cult". I wonder what the tests will reveal.
26-09-2017: After upgrading Keynote, it started to have some hiccups when exporting a movie. See here. Keynote has improved performance a bit, but it still uses a lot of resources with large presentations. When using it together with Zoom while teaching, I have had terrible problems, almost bringing down my machine, as Keynote sucked all resources from it (a brand new Macbook). My solution is to run a second laptop on a second account and join the meeting from there. The second laptop is only used for presentation and has no video in Zoom. This works.
09-09-2017: While looking up information on log tallies for a lecture in Math E320 (see blog), I came across some papers made available on Google Books. Google Books is a great project but starts to close up more and more. It needed some work to get this article and place it onto a local machine: screenshot page by page and glue it together. The log tally is on Wikipedia, as well as on several blogs, incorrectly attributed to Schenck because the Google Books document shows this book title. It would really have helped and prevented misunderstandings if the entire book could be downloaded as a PDF. It is a small thing, but it contributes to a feeling that we live more and more in a time of "IT infantilisation": music and videos are streamed, not owned. Books need to be read in reader devices or software like Kindle or "Google Play", where readers are tracked about their progress. Using software and media "as a service", one is evaluated and constantly monitored by a mainframe server somewhere. It is a "cloudy business". Major applications like Google Docs, Microsoft Word or Photoshop, calendar software, note taking software and backups are already there. Heaven forbid that a user or "customer" has anything they "own". It is better to have the user as a child who needs a guardian to function. Even computer algebra systems now have cloud versions. I stopped using Adobe Photoshop once it was "on the cloud" and would also stop using major CAS if they would go "cloud only". It is not only the users who have become kids who are constantly watched and controlled. At the moment, entire industries and universities outsource their IT structures. If the three main players Amazon, Google and Microsoft would cave, then not only would their industries disappear, essentially everything would collapse. The players have become too big to fail but are still not too big to merge.
A brave new world scenario is one where it is impossible to read or write anything without being tracked and marketed, where information is controlled by two to three players who, due to a lack of regulation, start to syndicate with the few remaining players. Even more scary is the prospect of disappearing personal computing infrastructure for the home user, where computing can only be done in smartphone-like operating systems in which the user is jailed in, or where one is billed for "computing as a service" on the "cloud". In such a world, a new player in the industry has little chance. Their innovative ideas are mined directly from the servers and fed into the artery of some giant. It will not even be people who have to "borrow the ideas": it will be machines, trained with sophisticated algorithms, that search through petabytes of information entrusted to a few servers. It is necessary to make as much information as possible public. But it should also be a matter of choice what becomes public or part of a third party and what does not. A start-up building up ideas needs to be able to do that without being sidelined by a large bully. Health data, start-up ideas, financial data or voting data need to remain safe. One could imagine for example a piece of software which goes through some cloud servers, looks for new ideas and submits patent applications if something interesting has been found. In the near future, it could happen that "owning a file" on a local computer is technically impossible, as the operating system is by design told to share everything with a central computer. A hack of a centralized system or a collapse of a data provider will be much more severe. Just two days ago, it was announced that the credit information of 143 million Americans has been exposed. Certainly, "big data" analysts have already started to mine and sell this data, as it is very valuable. The Equifax super-GAU (GAU is German shorthand for a worst-case accident) prompts thinking about "decentralisation".
There are data sets which need to be safe and kept from the public (like bank, credit, voting or health information) and then there are data which need to be free and public domain, like an article written one hundred years ago. What is needed? First of all, bulletproof strong cryptology for industry and private folks (this fortunately already exists, but there are forces which try to take it away). Second, less centralization and more diversity in IT structures. Third, a healthy IT group in each industry and university, as well as a well educated general population who can stand on their own feet and handle their basic computing needs, so that one cannot become a hostage of a few giants where one going down takes everything with it. It appears also healthy if copies of media are kept independently. A dystopian future like Fahrenheit 451 is still a possibility. Technology has made it possible to censor or change media content, not only text but also pictures and movies. Having only centralized "cloud" versions would enable such manipulations. This already happens in various places in the world. [Update September 24: Cloud computing has just started to charge by the second. It reminds me of an Encounter with Goldbach at a time when "mainframe computing" = "cloud computing" had its first appearance. We were infants at that time. We again have become infants today. Anyway, it is psychologically bad, especially in development and research, to be billed by a service. If one makes the investment in local hardware, one is encouraged to do computations and use it to the fullest. With the service model, a researcher has to question every second of computing time. Mommy, do I get a dollar to do this computation?]
18-07-2017: Links for a technology demo for today: An animated picture Strong lattice Fluid dynamics fluid Bubbles Vortex Sphere Surface cloud
11-07-2017: An important message of Vi Hart:
11-07-2017: I use my 12 inch Macbook every day, maybe 5-6 hours per day on average. Now 2 years old, it shows a battery service warning. Yes, the battery empties faster (5-6 hours now rather than 10); it still looks fine, but it seems its life will not last too much longer. The keyboard also shows its age. I type a lot. Some keys have lost their markings, which is not a big deal; others have started to become less reliable. I cleaned out some, like the space key, but removing a key risks breaking one of the tiny plastic latches (which happened to me). The keyboard would also need to be replaced. The risk is now here that one of the keys breaks for good, making the laptop unusable. I have done replacements of individual keys for Macbook Air laptops before, but it is quite expensive. To service the battery, 200 dollars; to replace the keyboard, again at least 200; then the time to schedule appointments with the genius bar, a couple of hours, and having the laptop not available for weeks. It would just not be feasible. I decided to use the still well working laptop as a backup machine and get a new 12 inch one. The strategy of buying relatively cheap laptops but replacing them regularly appears better than having an expensive one (Pro) but still facing the same long term problems like battery, hard drive and keyboard, which just happen to fade after 2-3 years of heavy daily use. I use the same strategy for my bike, which gets at least 3000 miles per year (I ride rain or shine, snow or heat, every day). After 2-3 years, the bike too starts to fail everywhere, and servicing it costs half of a new one. Also here, "buy relatively cheap but replace often" appears to be more effective than having a really expensive one. Then there is the risk of theft, which is just there both for laptops and bikes, and which would be devastating with a 3 times more expensive laptop or a 10 times more expensive bike.
04-07-2017: A vulnerability in RSA encryption illustrates that not only the mathematical security but also the actual implementation is important. In this case, it is the way the modular multiplication is done, which allows an attacker to recover some of the bits. Important work, as crypto security is crucial for a functioning society (banking, trade, health care, voting). See the article on Heise.
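To see why the implementation of modular arithmetic matters, here is a toy square-and-multiply modular exponentiation in shell arithmetic (a sketch for illustration, not the specific attack from the article): the conditional multiply that happens only for the 1-bits of the exponent is exactly the kind of data-dependent work that timing side channels can observe.

```shell
# Square-and-multiply modular exponentiation; the branch taken per
# exponent bit is what a timing side channel can observe.
modexp() {
  local base=$1 exp=$2 mod=$3 result=1
  base=$(( base % mod ))
  while [ "$exp" -gt 0 ]; do
    if [ $(( exp & 1 )) -eq 1 ]; then
      result=$(( result * base % mod ))   # extra multiply leaks this bit
    fi
    base=$(( base * base % mod ))
    exp=$(( exp >> 1 ))
  done
  echo "$result"
}

modexp 5 117 19   # prints 1 (by Fermat, 5^117 = 5^9 = 1 mod 19)
```

Constant-time implementations avoid this by doing the same amount of work for every bit, regardless of its value.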
22-06-2017: Why does one use &infin; in HTML while TeX uses \infty? The discrepancy is kind of annoying. The infinity symbol was introduced in 1655 by John Wallis. But who is to blame for the incompatibility? I think it might have been HTML, as the Unicode Consortium was incorporated in 1991 with the first versions built in 1986-1987, while TeX was released in 1978. ASCII came earlier but does not feature the infinity symbol (which is kind of a shame if one looks at the other things which have been chosen instead: see the List of ASCII codes). Apropos: the naming incompatibility between different languages is not a biggie. The extended ASCII flavours however were, and we still have to suffer from the sins of corporations which tried to embrace and destroy competition by inventing their own character sets or even their own ASCII versions. Still today, both in Adobe as well as in Word texts, one has characters like -, ", which look like ASCII but are not. Platform specific character codes remain annoying. It is good that both the Unicode and W3C consortia have gotten their grip together.
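For reference, Unicode settled the symbol at code point U+221E; HTML names it &infin; (or &#8734;) while TeX keeps its macro \infty. A quick shell check of the UTF-8 encoding (assuming a UTF-8 terminal):

```shell
# U+221E encodes to the three UTF-8 bytes e2 88 9e
printf '\342\210\236\n'                # prints the infinity sign
printf '\342\210\236' | od -An -tx1    # shows the raw bytes: e2 88 9e
```

The octal escapes 342 210 236 are just the bytes e2 88 9e written in the notation POSIX printf understands.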
17-06-2017: Having repurposed my 4K monitor as a second monitor for the Mac, I have tried a curved monitor (Dell UltraSharp U3415W PXF79 34-Inch). With a 3440x1440 resolution it does not match my 4K monitor with 3840x2160, but (maybe also because my eyes get older) I actually prefer to have a bit larger font while working. The widescreen (21:9) aspect ratio is very comfortable to work with. Here is a screenshot (click on the picture to see the full 3440x1440 pixel screenshot):
16-05-2017: A rare event: Youtube is down. Interesting error message (for Google developers to debug): (click for larger picture).
15-06-2017: A Heise article illustrates how Ethereum has heated up the cryptocurrencies. Ethereum is a gold rush, while Bitcoin tanks (for now). These things are always a bit of a pyramid scheme, but the blockchain technology looks hotter, as one can run code in decentralized applications. It also allows one to build smart contracts. The Ethereum virtual machine is Turing complete and can run any program; it is kind of like a universal Turing machine. This makes it interesting in a more general sense. The Ether currency shows exponential growth: ether or bitcoin.
10-06-2017: The SEO optimizers have become more sophisticated. It used to be stupid. But today, I got a personal email from a "math student" who for a "geometry project" needs to have a page linked to get "extra credit". Who does not want to help a student? The page however did not look like a project page. Yes, it had some information on it, but not done by a student and only remotely related to geometry. I asked back for the name of the school and the name of the teacher, but it was probably a waste of time. Must have been spam.
09-06-2017: Apple programs like Final Cut, Garageband or iPhoto feature an annoying violation of a "clean slate policy": the program by default starts with the previous project loaded. This is sometimes useful, yes, but annoying if one works on many different projects at the same time. Yes, one could organize different libraries, produce smart collections etc, but that is an attempt of the program to emulate part of the operating system. I don't want to rely on a program to get organized. I personally like to start every program with a clean slate: if I start with a project, then only the components of that project should be known to the program. Keynote, another Apple program, does this nicely: I can open a project with "open presentation.key" and do not have to worry about other presentations or work with different settings. If I open a project with "open project.finalcut", then the program should not know about older parts. Now, even if you use Final Cut and move a project somewhere else on the drive, the program will still find it and sometimes even load it. As I don't want to throw old projects away, I put them into another folder and make that folder unreadable (chmod 000 backupfolder), then work on a new project. I do the same with the Apple Photos app. I'm not interested in pictures taken a month ago; I don't want to have them even somewhere in a library nearby. I want to start with a new film, having organized the pictures I want to keep elsewhere. Also here, I now just put the old library in a directory and chmod 000 it so that the app does not pick it up. Similar violations of independence have started with the browser, where the program also wants a larger share of the operating system. I want the browser to start as an independent process which is not linked to services like Google. I might be logged in for one project in one browser and for another project in another browser, and the two are required not to know about each other.
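The hiding trick described above is just a permission change; a minimal sketch (the folder name is illustrative):

```shell
# Park old libraries in a folder the application cannot read into
mkdir -p old_libraries
# mv Old.fcpbundle old_libraries/    # move the old libraries in first
chmod 000 old_libraries    # Final Cut / Photos can no longer index it
# ... work on the new project with a clean slate ...
chmod 755 old_libraries    # restore access when the old work is needed
```

Since the app only scans readable directories, mode 000 makes the old libraries invisible to it without deleting anything.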
Mathematica also violates this policy when using the GUI. It does work well however if one uses Mathematica from the command line, one reason why I mostly work on the command line. This enclosure mentality is annoying and assumes that a user works on one thing only. For the web browser, I now use different browsers for different things to keep them independent: separating department work, administrative work, research, teaching, family and private work. It would be easy to fix: whenever the user starts a browser anew, it should start an independent process, or one should at least be able to configure it as such. This is the default for most applications. Why is compartmentalization important? It reduces the risk of mixing up things and adds more accountability in case something goes wrong in one part. It puts the burden of organizing projects on the operating system level and not on the individual programs. Localization and decentralization simplify and are more robust. It also produces "commutativity" of actions: having everything loaded at the same time makes things depend on each other. Another reason is that most programs now communicate with some server, sending information back and forth. I'm wearing different hats when working on different projects and don't want to have to change my computer to change from one project to the other. So, back to Final Cut: the last couple of days, I was uploading 30 hours of Youtube videos for a conference (it is a project with half a terabyte of movies). It is important not to get mixed up with different videos and renderings of different sessions. It is a time consuming process where not much can be automated, as rendering and uploading takes hours, each video clip needs to be trimmed and annotated, and the uploads fail (probably every third upload needs to be redone or done several times), the reason still being mysterious.
I first thought it was the hard-drives-going-to-sleep problem [which is an additional, unrelated annoying feature of many external drives, burned into the firmware so that one can only bypass it with helper programs touching a file on the drive every few minutes]. But these upload failures happen also with an essentially fresh Final Cut setup. As I have terabytes of movies in my libraries, one could easily blame it on a too large library. Now I know it is a bug which must be blamed on the ISP sometimes resetting the network, on a Final Cut instability, or on a Youtube problem. Strangely, it seems to be more frequent during the day than at night, which would point to a network instability problem (Youtube just comments "upload canceled"; the sharing progress usually stops around 51 percent). As usual in IT, it is the failures and limitations of tools which make up the time consuming parts, as one has to find ways around these limitations or redo things. It is not like 40 years ago, when virtually everything in IT would fail first [when starting with experimental mathematics as a teenager, I had to store my first Basic programs on tape, and often even that basic saving process would fail, but that was the norm]. Now, bugs have become rare, but they still eat most of the time resources. And because bugs are rarer, they are perceived as even more annoying.
01-06-2017: An article in NPR about soundtracks produced by computer composers. This is fascinating. We have a couple of times used Mathematica to compose: Examples. See also the lectures on Music and Calculus and AI.
31-05-2017: Another 5 hours of rebuilding my office machine. Since this is unpredictable, it is good to do it between semesters. While switching hard drives, one of the SATA cables broke off: the SATA connector of one of the important drives got stuck in the plug of the cable, killing both the cable and the drive. I got really mad because it was a nice new hard drive. I decided therefore to get one of these hot swappable hard drive containers (IStar 2BAY 2 x 5.25 To 3 X 3.5 Cage) and also got new sturdy SATA cables. Since my workstations are silent Thinkmate machines, I was worried that the additional fan would be noisy, but the enclosure shields it well. I also did a fresh install of the operating system, as my SSD has gotten old. Some minor surprises in Ubuntu 17.04: Perl now by default ignores local libraries and local files. An entry "export PERL5LIB=./:$PERL5LIB" in the .bashrc file solved this really annoying feature. It is the @INC variable, which is set when Perl is installed and which does not look for local libraries any more. What were they thinking? Another hiccup with the FTP server which accepts pictures from the LAN webcam: I should have known, it is not the first time, but if the configuration file (here /etc/vsftpd.conf) does not have the right permissions or is not owned by root, the server does not start up (without complaining). Ubuntu now also talks too much every time one ssh's into the machine. The chatty motd scripts are in /etc/update-motd.d. One could delete them, but they might be handy at some point; the easiest way to shut this off is to edit a file /etc/motd containing what one wants to display. Now it just gives a line telling when and from where the last login was. I also took the opportunity to upgrade Mathematica.
26-05-2017: It is rare these days to get into the case sensitivity trap on OS X. I regularly sync a work directory from my office machines to my laptop, which has a case insensitive file system. If two files like g.pdf and G.pdf are present in the same directory, one will bite the dust. It is usually no problem, but I just got bitten by this once more. I format external drives on the Mac with case sensitive file systems, but it might still be a risk to do that for the main drive. OS X is well done and almost perfect, but the case sensitivity is one issue which needs to be solved.
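Such collisions can be spotted before syncing. Assuming GNU sort and uniq (as on the Linux side of the sync), a one-liner lists every group of file names that differ only by case:

```shell
# List file names that would collide on a case-insensitive file system:
# sort case-insensitively, then print all case-insensitive duplicates.
find . -type f | sort -f | uniq -Di
```

If the command prints nothing, the tree is safe to rsync to a case-insensitive Mac volume; otherwise each printed group marks files where one would silently overwrite the other.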
24-05-2017: Spent an afternoon with a strange bug on my home machine. For some reason, the Ubuntu installer always produces a garbled screen. The machine is fine, the graphics card is fine and works both under Linux and Windows, and the install media are fine (they work on another machine). I excluded USB problems by using various flash drives and USB hard drives, tried out various other BIOS settings and then also used another monitor. I currently suspect that it is a low level graphics mode which is buggy, either on the motherboard or on the graphics card. Anyway, an afternoon gone.
12-05-2017: Our phones are now voice over IP. It is funny how the information leaflet mentions that "voice mail service has moved to the cloud". Dudes, it is just VOIP, web, internet. But I guess now everything has to be "the cloud" for marketing reasons. One of the arguments against VOIP had always been redundancy, and that things work even if the network is down. But as most people now have cell phones, a traditional phone line in case of emergencies is no longer so important.
26-04-2017: The Register today mentions the plan of Ajit Pai (head of the FCC) to kill net neutrality. No wonder: this guy was close to Verizon before going into politics. It would not surprise me if he is still close to their lobby. Killing net neutrality could be one of the worst consequences of the Trump presidency, which so far has a common theme: totally unqualified people are put into positions they never should be in. Even the relatively conservative "The Hill" calls it a "war on consumers". The EFF calls the proposal "devastating for competition, innovation and free speech". Indeed, its consequences could be terrible both for the economy as well as for democracy. It is time to contact the representatives.
18-03-2017: The Google JPEG encoder Guetzli is everywhere in the news. Here is the Google blog and here is the paper explaining the iterative optimization. I could not compile it from scratch on an older Ubuntu 14.04, but on OS X it compiled well. A test with a first picture gave a 7 percent reduction, from 29981 bytes to 27969 bytes. A compression of this picture did not go through yet; probably too large. For the smaller version, it took 40 seconds to reduce from 162589 to 120391 bytes (26 percent). Not bad. But for the larger 12 Meg picture, a re-encoding would take an hour; it would take days to re-encode one of my panorama pages. It is not the first time that a Swiss name has been used: there is also the Zopfli compression algorithm by Google. Why Swiss names? Some of the Google researchers like Jan Wassenberg are based at Google Zuerich. Wassenberg came from the Fraunhofer Gesellschaft, a German research organization. The JPEG 2000 data compression standard came from there, which, similarly to Guetzli, has a small compression advantage. It largely failed however because hardly any browser supports it (neither Firefox nor Google Chrome does) and also because it is riddled with patents, which is a death sentence. [Update: In a test with a large 12 Meg panorama, Guetzli worked for 2 hours and put out a 14 Meg version. The algorithm definitely seems to have difficulty with very large files.]
12-03-2017: I do essentially all my work from a terminal, which is why crashes of the Terminal app are especially annoying. I have used xterm for about 30 years and it has always been stable, on all platforms, even on thin clients or over slow modems. OS X Sierra is the first exception. It is a known issue; I find it happening more frequently when editing files with long lines. Fortunately, unix apps like vim have built-in recovery, so that one does not lose data when writing a program. It is still terribly annoying, and the stability of fundamental apps like Terminal should have the first priority. The issue has been known to Apple since last fall.
26-02-2017: When trying to upload this clip, it took only seconds to get banned from Youtube. While this spoof was accepted (with ads), the Rammstein clip is seriously protected. One definitely has to accept that. Even so, I believe fair use still applies: no monetary part, no damage for the producer. It is maybe not sufficiently small.
18-02-2017: An alarming trend: IT job reductions in the US, of course due to the increased centralization. Universities also follow the trend and outsource more and more of their IT. It is sad, as it used to be that the IT developed and maintained at universities was on the cutting edge, and it was encouraged to tinker and experiment with technology. Now things go to third parties: companies which can do things cheaper in a centralized manner, possibly abroad or in data centers where labor is less expensive. I personally believe this will come back with a great revenge. First of all, the IT reduction trend is demoralizing for young students interested in tech. This demoralizing effect could hit us in a few dozen years very hard and in many ways. But managers tend to think short term, even at universities. Here are a few reasons why thinking short term is dangerous: 1. It already becomes harder and harder to convince a young person to pursue a STEM career. The myopia of leaders not understanding the fear of becoming obsolete and powerless has even led to Trumpism, a phenomenon not even dreamed of one year ago. 2. Centralized data centers (let's call them the "revenge of mainframe computing and thin clients") pave the way to a risky future. A meltdown of a major player would already now risk the operation of industries, universities or even the economy. 3. We are still in a "buy-in phase", where vendors dump the prices to destroy competition and local IT structures. Once those are gone, everybody will be hooked and required to pay whatever prices are prescribed by the soon-to-be monopoly. We see that already in internet access, where the prices are unreasonably high due to the lack of choice. 4. History shows how fragile a political landscape can be (and how important technology can be to manipulate it).
I think that the current IT structures (a few big players controlling information) have already made it much easier for a totalitarian state to become a possibility (even in the US). There are some floodgates and safeguards in place; one of them (on the technology side) is strong cryptology, another the availability of open source operating systems. But there could come a time when it will be difficult for a new start-up to enter the field and compete, as the important information structures are controlled by a few players who own the patents, the pipes and the power. But there is not only doom: we also live in a great time of technology. Our desktop operating systems (like Linux) have become rock solid. Even a personal data center with a dozen terabytes of data has become cheap. And being able to carry around an entire library of books in one's phone would have been just unthinkable 15 years ago. P.S. I just remembered that I made my first steps in computing on a mainframe using a thin client. This was convenient: I did not have to carry around floppies and back up things myself. But it was also risky, and indeed, for some reason, I have lost most of the work done on that mainframe. Only some printouts survived. I would give a lot to get back what I wrote at that time about finitely presented groups, or to recover the projects done as a course assistant (like a cool project in which the students had to write an AI program solving the Rubik's cube; this means not just implementing a solution of the Rubik's cube but finding a solution path using the Schreier algorithm!). To be fair, I also lost the source code to many Pascal programs I wrote as a student on the Macintosh or Atari. There it was just negligence in backing things up properly, or having misplaced the backup floppies. But I still have tens of thousands of pages of not-yet-digitized diary books with mathematics and programs rotting in the basement. Maybe I will one day dig through that and scan a few things in.
28-01-2017: It is still a complete disaster with USB-C hubs. There is none which allows charging and using other USB-C devices at the same time. There is one product around the corner, but pre-ordering is not so much my thing; I need to have something right now and guaranteed. It seems however to be a fact that there is nothing available which allows the Macbook to access an external USB-C drive while charging. Currently, if I use my USB-C drive for a backup, the battery is drained until everything is backed up. I therefore turned to a network attached storage solution for my laptop backups (besides, of course, the local syncs from laptop to desktop which are done with "rsync").
23-01-2017: A great ode to VIM, and especially to its attitude towards innovation and features: resisting the urge of developers for UI rewrites, for features changing the workflow, or worse, for breaking compatibility. I myself write everything in vim, from simple notes, LaTeX documents and HTML documents to programs (even if programming languages offer their own file formats). VIM is now 25 years old. It has improved quite a bit. A decade ago, I started to warm up to syntax coloring.
10-01-2017: Some pictures showing the strange green spot, obtained in Panoramas. Looks spooky, but has an explanation. Just strange that in this case, the flares always appeared to come from the same spot in the meadow...
06-01-2017: Just upgraded the phone. A first test of the camera of the iPhone 7 in the Boston Library. Compare the pictures in Blockisland, which were done with the iPhone 6 and still had 40 megapixels. The new panoramas have 60 megapixels.

Oliver Knill, Department of Mathematics, Harvard University, One Oxford Street, Cambridge, MA 02138, USA. SciCenter 432 Tel: (617) 495 5549, Email: knill@math.harvard.edu Quantum calculus blog, Twitter, Youtube, Vimeo, Linkedin, Scholar Harvard, Academia, Google plus, Google Scholar, ResearchGate, Slashdot, Ello, Webcam, Fall 2018 office hours: TBA Mon-Fri 11-12:30 AM and by appointment.