If a city with a population figure of 500,000 gains three more residents, the population figure is... 500,000, not 500,003. The reasoning is this: the original figure was accurate only to its first digit (the hundred-thousands digit). It has a finite precision, and adding a number smaller than that precision has no effect on the original number.
Significant figures are not the same as the number of decimal places, although many people confuse the two.
Significant figures are needed for calculations with measured quantities. Measurements will have some degree of imprecision, and the rigor of significant figures keeps our calculations honest. The rules for significant figures are more complex (and subtle) than a simple "use 3 decimal places". The number of decimal places will vary, and some calculations may affect positions to the left of the decimal point. (As in our "city with 500,000 residents" example.)
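To make the idea concrete, here is a minimal Python sketch of rounding a value to a given number of significant figures. (This is a hypothetical helper written for illustration, not part of any standard library or existing package.)

```python
import math

def round_sigfigs(x, n):
    """Round x to n significant figures."""
    if x == 0:
        return 0.0
    # Position of the leading digit, e.g. 5 for 500,003.
    exponent = math.floor(math.log10(abs(x)))
    # Size of the last significant position, e.g. 10**5 for one sig fig.
    factor = 10 ** (exponent - n + 1)
    return round(x / factor) * factor

# The city example: 500,000 is accurate to one significant figure,
# so adding 3 residents does not change the reported figure.
population = round_sigfigs(500_000 + 3, 1)
# population is 500000, not 500003
```

Note that the rounding here lands to the left of the decimal point, which is exactly the behavior a simple "3 decimal places" rule cannot express.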
For a better description of significant figures, see the Wikipedia page.
Applications such as Microsoft Excel (or LibreOffice Calc) have no built-in support for significant figures. Nor, to my knowledge, are there plug-ins or extensions to support calculations with significant figures.
Perhaps the lack of support for significant figures is caused by a lack of demand. Most spreadsheets are built to handle money, which is counted (not measured) and therefore does not fall under the domain of significant figures. (Monetary figures are considered to be exact, in most applications.)
Perhaps the lack of support is driven by the confusion between "decimal places" and "significant figures".
But perhaps the biggest reason for a lack of support for significant figures in applications is this: There is no support for significant figures in popular languages.
A quick search for C++, Java, Python, and Ruby yields no such packages. Interestingly, the only language with a package for significant figures was Perl: CPAN has the Math::SigFigs package.
So the better question is: Why do programming languages have no support for significant figures?
Thursday, November 10, 2011
Tuesday, November 8, 2011
Advertisements in the brave new world of Swipeville
We've seen the different types of ads in Webland: banner ads, side-bar ads, in-text ads, pop-up ads, pop-under ads... all of the types.
My favorite is the "your page is loading" ad, which blocks the (completely loaded, who do you think you are kidding) content page. I like these because, with multi-tab browsers, I can view a different page while the ad-covered page is "loading" and the advertisement times out. With multiple tabs, I can avoid the delay and essentially skip the advertisement.
This all changes in the world of phones and tablets. (I call this new world "Swipeville".) Classic desktop browsers gave us tabbed windows; the new platforms do not. The phone and tablet browsers have one window and one "tab", much like the early desktop browsers.
In this world, we cannot escape advertisements by using multiple tabs. Nor can we look at another window (such as another application) while our page "loads". Since apps own the entire screen, they are either running or not -- switching to another app means that the browser stops, and switching back re-runs the page load/draw operation.
Which means that advertisements will be less avoidable, and therefore (possibly) more effective.
Or they may be less effective; the psychology of cell phone ads is, I think, poorly understood. Regardless of effectiveness, we will be seeing more of them.
Sunday, November 6, 2011
Picking a programming language
In the great, ongoing debate of language superiority, many factors are considered ... and brandished. The discussions of languages are sometimes heated. My purpose here is to provide some musings in a cool light.
The popular languages of the day are (in order provided by Tiobe Software): Java, C, C++, PHP, C#, Objective C, Visual Basic, Python, Perl, and Javascript.
But instead of arguing about the sequence of these languages (or even other candidates for inclusion), let's look at the attributes that make languages popular. Here's a list of some considerations:
- Platform: which platforms (Windows, OSX, iOS, Android, Linux) support the language
- Performance: how well the programs perform at run-time (whether compiled or interpreted)
- Readability: how well programs written by programmers can be read by other programmers
- Reliability: how consistently the written programs perform
- Cost: here I mean direct costs: the cost of the compiler and tools (and ongoing costs for support and licenses)
- Market support: how much support is available from vendors, groups, and developers
How well do languages match these criteria? Let's try some free association.
For performance, the language of choice is C++. Some might argue that Objective-C provides better performance, but I think the argument would come only from developers in the OSX and iOS platforms.
Readability is a difficult notion, and subject to a lot of, well, subjectivity. My impression is that most programmers will claim that their favorite language is eminently readable, if only one takes the time to learn it. To get around this bias, I propose asking which language is second-best in readability: most programmers will pick Python, and so I choose it as the most readable language.
I submit that reliability among languages is a neutral item. Compilers and interpreters for all of these languages are quite good, and programs perform -- for the most part -- consistently.
For cost, all of these languages are available in no-cost options. There are commercial versions for C# (Microsoft's Visual Studio) and Objective-C (Apple's developer kit), and one would think that such costs would give a boost to the other languages. And they do, but cost alone is not enough to "make" or "break" a language. Which brings us to market support.
The support of Microsoft and Apple for C# and Objective-C makes those languages appealing. The Microsoft tools have a lot of followers: companies that specify them as standards, and developers who know and keep active in the C# language.
Peering into the future, what can we see?
I think that the Perl/Python tussle will end up going to Python. Right now, Perl has better market support: the CPAN libraries and lots of developers. These factors can change, and are changing. O'Reilly has been printing (and selling) lots of books on Python. People have been starting projects in Python. In contrast, Perl loses on readability, something that is hard to change.
The Java/C# tussle is more about market support and less about readability and performance. These languages are about the same in readability, performance, and reliability. Microsoft has made C# the prince of languages for development in Windows; we need to see what Oracle will do with Java.
Apple had designated Objective-C, C, and C++ as the only languages suitable for iOS, but is relaxing its rules. I expect some change in the popularity of iOS programming languages.
But what about those other popular languages, the ones I have not mentioned? What about C, Visual Basic, PHP, and Javascript? Each has its fanbase (companies and developers), and each has a fair rating in performance, reliability, and market support.
I expect that Javascript will become more popular, continuing the current trend. The others I think will fade gradually. Expect to see less market support (fewer books, fewer updates to tools) and more conversion projects (from Visual Basic to C#, for example). But also expect a long life from these languages. The old languages of Fortran and COBOL are still with us.
Which language you pick for your project is a choice that you should make consciously. You must weigh many factors -- more than are listed here -- and live with the consequences of that decision. I encourage you to think of these factors, think of other factors, and discuss them with your colleagues.
Tuesday, November 1, 2011
Mobile first, desktop second (maybe)
Mobile computing has arrived, and is no longer a second-class citizen. In fact, it is the desktop that may be the second-class citizen now.
A long time ago, desktop applications were the only game in town. Then mobile arrived, and it was granted a small presence: usually m.whatever.com, with some custom scripts to generate a limited set of web pages.
Now, the mobile app is the leader. If you are starting a project, start with mobile, and build the "plain" desktop version later, if you have time. Focus on your customers; for new apps, customers are on mobile devices: iPhones, iPads, Android phones, and tablets. You can add the desktop browser version later, after you get the core running.
Monday, October 31, 2011
Bring your own device
The typical policy for corporate networks is simple: corporation-supplied equipment is allowed, and everything else is forbidden. Do not attach your own computers or cell phones, do not connect your own tablet computers, do not plug in your own thumb drives. Only corporate-approved (and corporate-supplied) equipment is allowed, because that enables security.
The typical policy for corporate networks is changing.
This change has been brought about by reality. Corporations cannot keep up with the plethora of devices available (iPods, iPads, Android phones, tablets... what have you) but must improve the efficiency of their employees. New devices improve that efficiency.
In the struggle between security and efficiency... the winner is efficiency.
IBM is allowing employees to attach their own equipment to the corporate network. This makes sense for IBM, since they advise other companies in the effective use of resources. IBM *has* to make this work, in order for them to retain credibility. After all, if IBM cannot make this work, they cannot counsel other companies and advise that those companies open their networks to employee-owned equipment.
Non-consulting corporations (that is, most corporations) don't have the pressure to make this change. They can choose to keep their networks "pure" and free from non-approved equipment.
For a while.
Instead of marketing pressure, companies will face pressure from within. It will come from new hires, who expect to use their smartphones and tablets. It will come from "average" employees, who want to use readily-available equipment to get the job done.
More and more, people within the company will question the rules put in place by the IT group, rules that limit their choices of hardware.
And once "alien" hardware is approved, software will follow. At first, the software will be the operating systems and closely-bound utilities (Mac OSX and iTunes, for example). Eventually, the demand for other utilities (Google Docs, Google App Engine, Python) will overwhelm the IT forces holding back the tide.
IT can approach this change with grace, or with resistance. But face it they will, and adjust to it they must.
Wednesday, October 26, 2011
Small is the new big thing
Applications are big, out of necessity. Apps are small, and should be.
Applications are programs that do everything you need. Microsoft Word and Microsoft Excel are applications: They let you compose documents (or spreadsheets), manipulate them, and store them. Visual Studio is an application: It lets you compose programs, compile them, and test them. Everything you need is baked into the application, except for the low-level functionality provided by the operating system.
Apps, in contrast, contain just enough logic to get the desired data and present it to the user.
A smartphone app is not a complete application; except for the most trivial of programs, it is the user interface to an application.
The Facebook app is a small program that talks to Facebook servers and presents data. Twitter apps talk to the Twitter servers. The New York Times app talks to their servers. Simple apps such as a calculator app or rudimentary games can run without back-ends, but I suspect that popular games like "Angry Birds" store data on servers.
Applications contained everything: core logic, user interface, and data storage. Apps are components in a larger system.
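As a sketch of this division of labor, consider a thin client whose only job is to fetch data from a back-end and present it. The server URL and the payload shape below are hypothetical, invented for illustration:

```python
import json

def present(payload: str) -> str:
    """The whole 'app': parse the server's JSON response and format it for display."""
    items = json.loads(payload)
    return "\n".join("- " + item["title"] for item in items)

# In a real app, the payload would come from an HTTP call to the back-end,
# e.g. urllib.request.urlopen("https://api.example.com/feed").read().
# A canned response keeps the sketch self-contained.
payload = '[{"title": "First story"}, {"title": "Second story"}]'
print(present(payload))
```

The core logic and data storage live on the server; porting this "app" to a new platform means rewriting only the few lines of presentation code.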
We've seen distributed systems before: client-server systems and web applications divide data storage and core logic from user interface and validation logic. These application designs allowed for a single front-end; current system design allows for multiple user interfaces: iPhone, iPad, Android, and web. Multiple front ends are necessary; there is no clear leader, no "IBM PC" standard.
To omit a popular platform is to walk away from business.
Small front ends are better than large front ends. A small, simple front end can be ported quickly to new platforms. It can be updated more rapidly, to stay competitive. Large, complex apps can be ported to new platforms, but as with everything else, a large program requires more effort to port.
Small apps allow a company to move quickly to new platforms.
With a dynamic market of user interface devices, an effective company must adopt new platforms or face reduced revenue. Small user interfaces (apps) allow a company to quickly adopt new platforms.
If you want to succeed, think small.
Monday, October 24, 2011
Steve Jobs, Dennis Ritchie, John McCarthy, and Daniel McCracken
We lost four significant people from the computing world this year.
Steve Jobs needed no introduction. Everyone knew him as that slightly crazy guy from Apple, the one who would show off new products while always wearing a black mock-turtleneck shirt.
Dennis Ritchie was well-known by the geeks. Articles comparing him to Steve Jobs were wrong: Ritchie co-created Unix and C somewhat before Steve Jobs founded Apple. Many languages (C++, Java, C#) are descendants of C. Linux, Android, Apple iOS, and Apple OSX are descendants of Unix.
John McCarthy was known by the true geeks. He did foundational work in artificial intelligence and created a language called LISP. Modern languages (Python, Ruby, Scala, and even C# and C++) are beginning to incorporate ideas from LISP.
Daniel McCracken is the unsung hero of the group. He is unknown even among true geeks. His work predates the others' (except McCarthy's), and arguably had a greater influence on the industry than any of theirs. McCracken wrote books on FORTRAN and COBOL, books that were understandable and comprehensive. He made it possible for the very early programmers to learn their craft -- not just the syntax but the craft of programming.
The next time you write a "for" loop with the control variable named "i", or see a "for" loop with the control variable named "i", you can thank Daniel McCracken. It was his work that set that convention and taught the first set of programmers.
Labels: Apple, books, C, COBOL, Daniel McCracken, Dennis Ritchie, Fortran, John McCarthy, LISP, steve jobs, Unix, unsung hero