Saturday, June 29, 2019

A Better User Experience -- The Reason The Internet Is Spying On You


Open a newspaper, turn on the news, open nearly any social media application, or listen to radio or podcasts...it's nearly impossible to avoid someone, somewhere touting the sinister collection and usage of user data.  The latest boogeyman of modern society, this villain is dragged on stage by the world's masters of ceremony to be pelted with tomatoes by an enthusiastic crowd of participants.  Politicians, seeing an opportunity, capitalize on the outrage, often overlooking the hypocrisy of mandating participation in surveys officiated by the Census Bureau.

User privacy is an incredibly complicated topic, far too complicated to cover in full here.  Rather, I'd submit to the 2+ possible readers who stumbled upon this post during a particularly boring company meeting, insomnia-fueled browsing, or a particularly long bowel movement, that privacy is an individual choice and should be an informed decision.  Aimed at those who may not consider themselves tech-savvy, the remainder of this post focuses on 'why your phone/computer knows so much about you': the likely motivations from a technical perspective.

Money

Despite the popular 'evil corporation' trope, I personally don't feel there is a legion of business leaders tenting their fingers around Hitler's well-preserved French-inspired boardroom conference table and plotting how they can pillage the remaining ounces of user data from the peons.  Instead, you have companies ranging from start-ups to billion-dollar organizations trying to figure out how to fund their operations.  For example: Google's Alphabet reported $39.3 billion in 2018 revenue and $9.19 billion in net income, which leaves ~$30 billion in annual operating expenses.  Droves of engineers, scientists, marketing personnel and others require salaries.  Multi-million dollar server farms require equipment.  Monthly utility costs alone are likely beyond the capacity of understanding for folks like you and me.  Big Daddy's gotta eat, and companies like this found a way to create revenue from something seemingly worthless at the time.  With approximately 2 billion active devices, do a little simple math: that's ~$15/device to cover Google's annual expenses.  Now, count the number of devices (phones, tablets, computers,...) you have in your family and tally it up.  Would you consider cutting an annual check to Google for that amount?  Yes?  Cool, calculate the same for all the other providers you use: Reddit, Twitter, Snapchat, Instagram, Facebook...inventory your phone...I'll wait.
User data is obviously big business; perform some mental gymnastics and consider the alternatives for funding such technical powerhouses.  Personally, Google alone provides me more benefit than local and federal government on a daily basis, but I wouldn't tolerate an equivalently sized tax to retain such tech, and out-of-pocket expenses would likely be comparable.  Alternatively, we could return to the pre-web days of physical maps, encyclopedias, and newspapers. ¯\_(ツ)_/¯

My apologies, I got a bit long-winded and certainly preachy.  Please focus on my intended point: these revolutionary technologies cost money to build and maintain, people have demonstrated a reluctance to pay market value, and user data has stepped in to fill that void.  If the currency of user data were to leave, the tech would follow, or the monthly expenses would somehow make their way to the consumer; anyone who says otherwise is living in an alternate reality.

Language

Let's start light: your computer and Internet applications know your preferred language.  The reason is likely apparent, but pretend for a moment it isn't.  With a population of roughly 1.4 billion in China compared to a mere 330 million in the US, how useful would it be if the majority of your web content were provided in Mandarin?  I've personally witnessed folks throwing a hissy-fit when an ATM requests a preferred language on the primary screen.  Selecting a language is an additional stumbling block in accomplishing your goal: getting relevant information.  When a web application knows your preferred language and the language of the content, it can even offer to translate the content for you, auto-magically.

Location

But why does the Internet need to know my current location?  For those of you who didn't live through the era of the floppy disk, you may not have lived through the birth of the Internet either.  In the pre-Cambrian era of technology, the Internet was primitive and small.  You could navigate it the same way you conducted a search in your local library: dig through a card catalog and find information local to your community library.  As this virtual library (our Internet metaphor) scaled to a massive level, a better way to peruse it became not only desired but essential.  Location can easily be viewed as a primary descriptor for finding relevant content.  When searching for Department of Commerce information, you're likely interested in the one for the state in which you currently reside.  Searching for 'hospital' likely wouldn't satisfy your true goal if it returned results on the other end of the country.  If you're inquiring about the 'business hours of Applebees', you're likely interested in whether you have time to grab a late-night cocktail, not the hours in another state and time zone.  In that pre-Cambrian era of tech you had to constrain ALL your searches with location whenever it was relevant.
But just for fun, try doing without: if you have access to a VPN, select an endpoint on the other side of the country (or world) so the Internet's best guess at your location is not representative of where you really are.  Next, open an incognito window to avoid location awareness.  Do that for a day, qualifying all searches with your current city/state, and measure your frustration.  Without location your Internet life/searches become much, much less user friendly; functional, but far less convenient.
Exploring local dining and event options without your current location pushes a good deal of complexity onto the user.  Suppose you're visiting a new city, rife with suburbs, and are looking for a quiet coffee house.  Perhaps you're a well-equipped traveler with knowledge of the surrounding suburbs, prepared to conduct a series of searches for coffee shops in each so you can select the one best suited to your particular taste.  More likely, you're not.
I'm intentionally going to leave out vehicle and walking navigation, as I feel the need for location there is pretty apparent; I haven't met anyone who would be enthusiastic about returning to the days of paper maps.

Search/Watch/Navigation History


I'll sprint to the table for a conversation about how much I hate unsolicited advertising.  Today's equivalent of the USPS mailbox bursting with personalized 'offers' and ads is still prominent, but has been augmented with shotgun-style advertising via e-mail and other web presences.  Big Daddy's still got to eat, and I'm significantly more tolerant of advertisements for products I may actually use.  Call it targeted advertising, tattoo it with maliciousness, but honestly...would you rather have advertised products that align with your interests, or ones of days past: 2-for-1 diaper deals (infant or adult) when you're childless, window-installer fliers to renters,... I'll gladly exchange irrelevant ads for ones that are applicable to things I want or need.  I still hate ads, but I hate targeted ones less.

Intelligence baked into navigation history provides benefits; how else would you know which of the bazillions of Internet cat videos you'd like to watch?  Watch 'Corner Gas', let me recommend 'Letterkenny', exposing you to series that you otherwise may never have seen.  Just bought a Toro 24" mower?  What if next season you're prompted with the filter/plug replacement parts rather than having to remember and look up the part numbers?  Suggestions provided by 'the algorithm' sound ominous, but just try to recall what it was like before Barnes & Noble suggested authors you may like based on your previous readings, when every news outlet published a generalized bundle for the masses.  Advertising and knowledge distribution today view you more as an individual than ever before; recognize the blessings of this.

Device Info

Capturing device information and usage provides the holy grail of support and market-based business decisions.  Consider the US Census again; why does it exist?  Is it a plot to expose all our individual secrets?  Or is it an essential metric for understanding where the nation can/should invest?  A business armed with a keen understanding of how users are using its products, what they use, and what they don't has an essential input for determining where to continue investing.  Prior to user-based metrics, business leaders took the word of third-party personnel (sales folk, politicians, and product-domain mentalists), or sub-sampled users with surveys and extrapolated the results to make business decisions.  Or, worse, they simply guessed.  Every button you click, every scroll pane you see, every screen and graphic costs money to build and maintain.  Every release, countless features are exhaustively tested by an army of testers.  Identifying obsolete features is essential to discontinuing unneeded/unwanted ones; without that, you indiscriminately spend on unneeded products (e.g. https://www.smithsonianmag.com/smart-news/fda-used-have-people-whose-job-was-taste-tea-180967545/).

Today, leaders are armed with data to make data-driven decisions: exponential growth means you'd better invest in personnel and equipment 'cause the flood of users is coming; exponential shrinkage means the feature/product may be on a dying path.

Birthdate

With the exception of a mere handful of family members, I generally don't remember birthdays, and I consider it a blessing that my phone reminds me when my amazing nieces/nephews have a birthday coming up.  Despite the cesspool that the Internet can be, it has the ability to bring us together.  Recognizing and celebrating a single day to show someone how you feel is a worthwhile step in the right direction.  The interests of a 20-something and a 60-something differ dramatically; that knowledge allows providing info relevant to you.

Contacts

Verbal commands have introduced a new element in how we interact with systems.  Pop quiz: what's the telephone number of your sister/brother/buddy?  Chances are, you're more likely to remember the capital of North Dakota than that phone number.  Without associating a name with a phone number (the purpose of your contacts), you'd be left with 'Ok Google, call 952-555-1212' or doing without verbal-command conveniences.  Ditto for e-mail, SMS and such.  Third-party applications request access to your contacts to allow sharing via SMS, e-mail and the like.

Closing

User privacy is certainly a complex subject, and like your social security and bank account numbers you should protect your data accordingly.  A good deal of attention has been paid to why/how to protect user privacy, but these discussions seldom speak to why the information is being gathered and/or used.  Fear sells, so a good chunk of media attention goes to the potential malicious use of the information.  Having spent going on 3 decades in the software engineering industry, I can speak with some confidence that there are no droves of shady software engineers hiding in the shadows, writing applications with the sole intention of stealing your semi-sensitive information for the sole purpose of mischief.  Instead, many conveniences and good user experiences are built upon this information, making things more convenient and usable.  The primary goal of the Internet is (or should be) to get relevant information as easily and quickly as possible.  Searching for the closing hours of that boutique French bakery down the street whose name has currently escaped you shouldn't require "french bakery english language st paul mn united states"; instead, 'french bakery' simply returns Trung Nam French Bakery at the top of the result list.  Less typing, more croissants.

Finally, I'm not advocating simply hammering the 'accept' button and freely giving away your privacy to any Tom, Dick and Harry.  I value my data privacy as much as most, but understanding why and how your data is being used is part of making an informed decision.

Now, go find something cool.

Sunday, June 23, 2019

My Software Has Tassels -- Clean Out Your Software Garage


All software engineers, as well as those in many other disciplines, at some point discover features, elements, or entire subsystems that are unused, unwanted and unnecessary.  While there are likely a good number of reasons for this, I thought I'd outline and discuss some I've encountered; perhaps you've run into them too.

Some of these turds are simply our own doing; others are driven by influences external to the team.

Why Get Rid Of Unnecessary Software

While there are just shy of a bazillion notable reasons dead code should be exorcised like a demonic possession, I'll only hit on a few before moving on to how it may have gotten there in the first place.

Expensive Scavenger Hunt

Few things are more frustrating and costly than investigating a bit, or a massive amount, of code in an attempt to understand its intended purpose, only to later find it is never called/used/wanted...  That, my friend, is a real kick in the knackers and just about enough to bring an otherwise collected professional to tears or rage.  This typically isn't a one-man/woman show either; often it involves reaching out to various team members, ballooning into a 'you should talk to ...' domino chain.  By the end, these non-productive Easter egg hunts cost you money and time, two things that come in short supply these days.

Documentation Recurring Costs

Like that pyramid of dried-out paint cans every homeowner has in their garage, clutter costs you money, directly or indirectly.  Unused features still require, or should require, documentation and support efforts.  Documentation evolves no differently than the product itself, and as the documentation is updated the cost of the associated unused features comes along for the ride.  Like sanding a block of wood before tossing it in the fireplace, that's time/effort completely lost.  Some will simply take the tack of not updating documentation revolving around unused features, and that is just denial; if you can't take pride in your product and you half-azz it, how do you expect your users to?  Good documentation should flow, with consistency and continuity.  Simply ignoring sections is a cop-out from making the right, and tougher, decision and will still cost you money.  Uncommon features and controls will eventually be found by curious users, resulting in questions and support queries.  Hide them away like your children's Christmas presents and they are still sure to be found.

Testing Recurring Costs

Perhaps you're lucky and the code-base for unneeded software never changes.  Then your testing costs are pretty limited: unit test time, pure automated tests...CPU cycles come cheap.  But I've never, in my 20+ years, worked for a company that didn't have extensive manual testing efforts.  Even if not a single line of feature code changes, there is still a risk the feature no longer works.  Indirect dependencies (libraries, data feeds, latencies, operating system changes) can easily impact feature functionality even if the feature code itself never changed; that's the whole reason you have QA and Testing departments.  "We just won't test the unused features"; same band, different tune...same cop-out.
Worse, suppose the feature code does change, perhaps due to replacing a core library that sprinkles itself throughout the system.  Polish that firewood.

Development Recurring Costs

Direct development recurring costs are pretty simple: hours x rate = a buttload of money.  If you're steadfast in keeping unwanted features, then it's just deserts that your corporate wallet takes a hit.  Indirect costs fall on team morale.  Good teams take pride in their work, and you can visibly see the pride drain from their eyes when they find they'll be working on an unimportant feature.  It sends a message that their time isn't worth more than meaningless tasks.

Pride In Your Product

I'd argue that one of the greatest motivators for a team is taking pride in their product; not the schedule, not the budget, not the revenue...  'Purpose' is the latest lingo, and is perhaps the strongest separator between a 'job' and a 'career'.  Clean and elegant rather than clumsy and complicated; unneeded feature-creep results in a clumsy product that will soon take on a smell like rancid meat.

How'd It Get In There In The First Place?

Proof-Of-Concept Retention

"Beth, I need you to prototype XXX"; that's how the activity normally begins.  Prototyping is necessary for any number of reasons: establishing market interest, generating a rough development estimate, marrying two independent systems... The promise is always the same: stitch in a prototype and we'll refactor/redesign it once we know we want to move forward.  Strong leaders hold up their end of that promise; lesser ones won't.  Worse, if the proof-of-concept is deemed unwanted, good leaders will orchestrate the removal of the design; lesser ones won't.

Sales Feature Request

Sales folks' sole purpose is to drive sales, bring in revenue and seek commissions; period, full stop.  No cash, no company.

If you've ever been frustrated standing behind a customer in a fast-food restaurant who simply can't make up their mind, then you've witnessed the collective whole of the customers sales folks deal with daily.  As a whole, people are indecisive about which happy meal they want; imagine how much uncertainty they have in buying a product 10x, 100x, 1000x the cost.

In fast-food speak, customers think they want a quarter pounder, but not with the condiments you have; instead they want mango salsa.  So the cashier rushes out the door, jumps in their car, drives to the nearest grocery store, finds they don't have it, drives to another...finally returning, adds this new condiment to a freshly prepared burger, bags it, rings it up, and along the way the customer changes their mind and decides they don't want a burger at all.  The customer doesn't have any skin in the game and can back out with limited consequences.

"Boy, we can sell the crap out of this in [Russia, France, India, China,...]; it only needs this one thing..."  Then the market is found bare and the feature remains as a cruel ongoing reminder of a failure, always...ALWAYS...with the promise "it'll take hold, give it some time".

Contractual Obligation

You're a young man venturing into the lawn-cutting business after school.  You've got no customers, endless energy and a drive to make this new venture work.  After encountering no after no after no, weary and desperate, you approach a homeowner who is willing to purchase your services...with one condition: you use a push reel mower.  As your greens-keeping empire grows and your relationship with this customer continues, there will come a day when this constraint begins costing you more than it's worth.  Loyalty, contractual agreements or simply a gentleman's handshake can keep you maintaining features that no longer make strategic sense.  Early customers often find a willing and desperate team ready to specialize its product simply to get a sale/relationship.  Cutting the cord is difficult at best.

Reimbursed Feature

You hear horror stories: a customer kicks in a modest financial cost-share for a feature development effort.  Then, an eon later, the feature remains; sometimes it isn't even used, but everyone is wary of removing it.  Removal would take a discussion, perhaps a buy-back, and likely be a complicated endeavor, so it's simply avoided.  It's still a wart; hopefully the financial kick-in was substantial, but often it is modest, and certainly not in the ballpark of the ongoing, unending recurring costs to maintain the feature.

'Spec Out'

I was unfamiliar with this term until I worked in the DOT commercial sector, but I suspect it's relevant elsewhere.

While perhaps well-intended, regulations and constraints are often misused; it's human nature.  If you're a 'true blue Ford aficionado' and know, deep in your soul, that Fords are the superior vehicle, you'll feel a sense of responsibility to procure these bullet-proof vehicles for your fleet.  Meanwhile, every vehicle manufacturer wants a seat at that table.

So, you use product requirements to your advantage.  Say Fords have slightly better fuel efficiency; you add that to your product requirements: 'must provide 20+ mpg'.  You can justify it, too; fuel costs are substantial for your fleet.

To be "spec'd out" is just slang for not meeting the product requirements/specifications.  It's a cat-and-mouse game between vendor and purchaser, and it can be a contributor to unneeded/unwanted/incomplete features.  The vendor-preferred product X has a web interface, so a web interface is called out in the spec; now we need a web interface even if the users never want or use it.

Regardless of whether the feature is used or needed, features can be required simply to get a seat at the table.  Worse, many of these features are never really wanted, and when that's known the emphasis shifts to delivering "something", sometimes without even caring whether the feature works.

The cynical side of me feels the ever-growing specification is intentional, but whispers of alternative reasons drop in occasionally.  When you're selecting a tool for your team, 'more' can be mistaken for 'better', and as a key decision maker you want the best for your team.  A Swiss Army knife with 12 functions is better than one with 8...right?  Unless you know that 4 of the additional functions are of no use to your large and diverse team, isn't it safer to pick the fuller-featured one?  Who has the time to survey their team about what they really need...just get 'em the bigger one.

Closing

Likely this is stuff you already know, and as you can see I offer no detailed solutions.  Often the motivation for retaining dead code is a couple of levels up in the org chart, the consequences left to be dealt with by the engineering oompa loompas.
/shrug

Sunday, June 16, 2019

Do You Trust It?

I've had this topic on my todo list since Dec 5th, 2016.  My motivation was aimed at past experiences, but it's particularly timely given the Boeing 737 Max fiasco.

Trust is a complicated subject, and one of the most powerful demonstrations of trust in a person or product is when you place your life on the line.  That greatly simplifies a response; it becomes binary, and I'd argue that is the true measure of trust in a person or thing.

The first time I entertained the question 'do I trust this thing' from an engineering perspective was when I was working for a defense contractor that specialized in designing weapon systems.  Throughout the development stages the product goes through a metamorphosis of stability.  In the early stages, when the system is swinging equipment around, you stay 30 yards away.  Slowly your confidence builds as the risky bits of the system are addressed, and you step a bit closer.  Closer, and closer, until you trust the system with the same level of confidence that the soldiers will need to.

Misguided trust can be catastrophic.  Like the untimely death of Garry Hoy, a lawyer who emphatically trusted the installation of high-rise office building windows, misplaced trust can end tragically: https://en.wikipedia.org/wiki/Death_of_Garry_Hoy

So I always wondered: how many of these folks trust the system enough to put their life on the line?  Would I?

Enter Boeing.  I'm sure the 737 root cause can be found, and likely remedied.  Fast-forward a few years from now.  You're making your way down the jetway to your awaiting plane; you make it to your seat and begin settling in.  Like every good passenger, you refer to the safety brochure, and your gaze is caught by '737 Max'.  Where are you now?  Still seated, or is there a cartoon-style smoke outline of where you once were while your physical body approaches the sound barrier on its way back to the rotunda?

Assuming the plane isn't decommissioned indefinitely, what would it take for you to trust it not to fall from the sky once again?

Would the Boeing board of directors climb into a randomly selected aircraft for a series of stops around the world?  Would they trust the fix with their lives?  How about the engineers?  Would the FAA be willing to put the top tier of its organizational chart on a trip to Vegas aboard one?

I certainly hope so; if they wouldn't, yet it's released to the rest of the world anyway, I think that would speak volumes about how they view us common folk.

I hope we all trust the products we are designing, and if not, we really should.

Sunday, June 9, 2019

Building Gtk Applications with the GNU Build System


Brief

In training (Qs-GnuBuildEnvironment) we discussed the rationale and means of using the GNU Build System in application development and deployment. This training builds on that material with instructions on how to utilize the build environment in the development of Gtk applications.

Introduction

The GNU Build System (also known as autotools) is a suite of tools from the GNU Project used to make software products portable. The build system includes two significant programs: automake and autoconf. The automake utility generates makefiles from a Makefile.am template. The autoconf utility generates the configure script. Together, these utilities provide the means of interrogating the target system and generating a target-specific makefile used to compile the application.

Before beginning, it is assumed that your system has the following packages installed:
  1. autoconf
  2. automake
  3. gcc
  4. g++
  5. pkg-config
  6. libgtk2.0-dev

Discussion

We will begin with a simple non-Gtk application, examining the GNU Build System configure.ac and Makefile.am inputs. Next, we will extend these files to support compiling and linking against the required Gtk includes and libraries.

Non-Gtk Application

We'll begin with an example similar to that outlined in training Qs-GnuBuildEnvironment, simplified to use a single source file. Without further discussion, we identify the contents of the input files and an abbreviated procedure for setting up the build environment.

configure.ac

AC_PREREQ(2.59)
AC_INIT(FULL-PACKAGE-NAME, VERSION, BUG-REPORT-ADDRESS)
AM_INIT_AUTOMAKE(someApp,2.1)
AC_CONFIG_HEADER([config.h])
AC_PROG_CC
AC_OUTPUT(Makefile)

Makefile.am

bin_PROGRAMS=someApp
someApp_SOURCES=myMain.c

myMain.c

#include <stdio.h>
#include <stdlib.h>

int main()
{
    printf("(%s:%d) main process initializing\n", __FILE__, __LINE__);
    printf("(%s:%d) main process terminating\n", __FILE__, __LINE__);
    return EXIT_SUCCESS;
}

Brief Build System Setup

Given the configure.ac, Makefile.am, and myMain.c files, stepping through the following procedure will result in building the someApp application.
$ aclocal
$ autoconf
$ autoheader
$ touch NEWS README AUTHORS ChangeLog
$ automake -a
$ ./configure
$ make


Gtk Application Extension

If our application is critically dependent on Gtk, extending the autoconf package requirements to include Gtk will be our first step. Our configure.ac file changes to include the Gtk dependency:

configure.ac

AC_PREREQ(2.59)
AC_INIT(FULL-PACKAGE-NAME, VERSION, BUG-REPORT-ADDRESS)
AM_INIT_AUTOMAKE(someApp,2.1)
AC_CONFIG_HEADER([config.h])
AC_PROG_CC
GTK_REQUIRED_VERSION=2.0.0
PKG_CHECK_MODULES([GTK], gtk+-2.0 >= $GTK_REQUIRED_VERSION)
AC_OUTPUT(Makefile)

After making the above changes to the configure.ac file you must repeat the procedure for re-establishing the build environment.
$ aclocal
$ autoheader
$ automake --add-missing --copy
$ autoconf

Running ./configure will now execute the Gtk test rule, giving the following output:
checking for GTK... yes
If you examine the compiler output, however, you will find neither the Gtk include path nor the Gtk library path. The Gtk include and library paths must be passed to make to accomplish our intentions.

The Makefile.am file must be extended to pass the Gtk flags and libraries on to the compiler. This is done by updating the Makefile.am as follows:

Note: if you get a "syntax error near unexpected token 'GTK,gtk+-2.0'" when running ./configure, you'll need to install the pkg-config package.

Makefile.am

bin_PROGRAMS=someApp
someApp_SOURCES=myMain.c
someApp_CPPFLAGS = @GTK_CFLAGS@
someApp_LDADD = @GTK_LIBS@

Variables surrounded by @'s are propagated unchanged into Makefile.in; the configure script then makes the necessary substitutions, expanding them as appropriate when generating the final Makefile.
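To make the substitution concrete, here is a hedged sketch of what the generated Makefile might contain after configure expands the @-delimited variables; the exact paths and library names below are assumptions and will vary with your system's pkg-config data:

```make
# Illustrative only -- actual values come from pkg-config's gtk+-2.0 data.
GTK_CFLAGS = -I/usr/include/gtk-2.0 -I/usr/include/glib-2.0
GTK_LIBS = -lgtk-x11-2.0 -lgdk-x11-2.0 -lgobject-2.0 -lglib-2.0
```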

Repeat the previous procedure since we've made modifications to the autotools input files:
$ aclocal
$ autoheader
$ automake --add-missing --copy
$ autoconf

Now, building the application will result in the GTK_CFLAGS and GTK_LIBS arguments being passed to the compiler. Inclusion of Gtk headers and references to Gtk objects can now be resolved during compilation.

Conclusion

In closing, authoring a GNU Build System compliant product offers flexibility to the user: it incorporates a standardized means for examining the target system, ensures the required product dependencies are met, and offers a well-defined configuration and build environment when distributing your product as source. Incorporating the Gtk libraries can be daunting when referencing man pages or elaborate examples, but is relatively straightforward with a simplified example.

References

  1. Introduction to the GNU Build System; FSK Consulting








Sunday, June 2, 2019

GNU Build System – Conditional Features


Brief

In training (Qs-GnuBuildEnvironment) we discussed the rationale and means of using the GNU Build System in application development and deployment. This training builds on that material with instructions on how to utilize the build environment in the development of more sophisticated applications, where autoconf examines the system and tailors the application depending on the existence of module dependencies.

Introduction

The GNU Build System (also known as autotools) is a suite of tools from the GNU Project used to make software products portable. The build system includes two significant programs: automake and autoconf. The automake utility generates makefiles from a Makefile.am template. The autoconf utility generates the configure script. Together, these utilities provide the means of interrogating the target system and generating a target-specific makefile used to compile the application.

Before beginning, it is assumed that your system has the following packages installed:
  1. autoconf
  2. automake
  3. gcc
  4. g++
  5. pkg-config
  6. libgtk2.0-dev

Discussion


We begin with an example similar to that outlined in training Qs-GnuBuildEnvironment, simplified to use a single source file. Without further discussion, we identify the contents of the input files and an abbreviated procedure for setting up the build environment.

Base Example

configure.ac

AC_PREREQ(2.59)
AC_INIT(FULL-PACKAGE-NAME, VERSION, BUG-REPORT-ADDRESS)
AM_INIT_AUTOMAKE(someApp,2.1)
AC_CONFIG_HEADER([config.h])
AC_PROG_CC
AC_OUTPUT(Makefile)

Makefile.am

bin_PROGRAMS=someApp
someApp_SOURCES=myMain.c

myMain.c

#include <stdio.h>
#include <stdlib.h>

int main()
{
    printf("(%s:%d) main process initializing\n", __FILE__, __LINE__);
    printf("(%s:%d) main process terminating\n", __FILE__, __LINE__);
    return EXIT_SUCCESS;
}

Brief Build System Setup

Given the configure.ac, Makefile.am, and myMain.c files, stepping through the following procedure will result in building the someApp application.
$ aclocal
$ autoconf
$ autoheader
$ touch NEWS README AUTHORS ChangeLog
$ automake -a
$ ./configure
$ make


Example Extension

The autoconf utility allows examining the target system for features. A common action is to let autoconf define C preprocessor symbols based on the results of such feature tests.

configure.ac

AC_PREREQ(2.59)
AC_INIT(FULL-PACKAGE-NAME, VERSION, BUG-REPORT-ADDRESS)
AM_INIT_AUTOMAKE(someApp,2.1)
AC_CONFIG_HEADER([config.h])
AC_PROG_CC
AC_CHECK_HEADER([popt.h],
    [AC_DEFINE([HAVE_POPT_H], [1],
        [Define to 1 if you have <popt.h>.])],
    [AC_MSG_ERROR([Sorry, can't do anything for you])])
AC_OUTPUT(Makefile)

After making the above changes to the configure.ac file you must repeat the procedure for re-establishing the build environment.
$ aclocal
$ autoheader
$ automake --add-missing --copy
$ autoconf

Running ./configure will now execute the AC_CHECK_HEADER feature test, giving the following output:
checking for popt.h... no
configure: error: Sorry, can't do anything for you
or
checking for popt.h... yes
If the feature test for popt.h fails, configure outputs an error and aborts. If it succeeds, AC_DEFINE defines the HAVE_POPT_H preprocessor symbol as 1 in config.h. This test effectively defines a critical dependency on popt.h, which isn't exactly what we intended. What we intended is the definition of a preprocessor symbol based on autoconf's examination of the system. If we update the configure.ac file as follows we will get the desired result.

configure.ac

AC_PREREQ(2.59)
AC_INIT(FULL-PACKAGE-NAME, VERSION, BUG-REPORT-ADDRESS)
AM_INIT_AUTOMAKE(someApp,2.1)
AC_CONFIG_HEADER([config.h])
AC_PROG_CC
AC_CHECK_HEADER([popt.h],
    [AC_DEFINE([HAVE_POPT_H], [1],
        [Define to 1 if you have <popt.h>.])])
AC_OUTPUT(Makefile)
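When configure finds popt.h, the generated config.h will contain a definition along these lines (an illustrative excerpt; the file is generated by configure, not written by hand):

```c
/* Excerpt of a generated config.h -- do not edit by hand. */
/* Define to 1 if you have <popt.h>. */
#define HAVE_POPT_H 1
```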

Next, by placing preprocessor conditionals we can test for the definition of the HAVE_POPT_H symbol. This allows us to place conditional code in our application; in the absence of the popt library we could substitute our own command-line parsing routine. Note that the symbol lives in the generated config.h, so the source must include that header for the conditional to take effect.

myMain.c

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include <stdio.h>
#include <stdlib.h>

int main()
{
    printf("(%s:%d) main process initializing\n", __FILE__, __LINE__);
#ifdef HAVE_POPT_H
    printf("(%s:%d) whee...we have popt.h\n", __FILE__, __LINE__);
#else
    printf("(%s:%d) rats...we don't have popt.h\n", __FILE__, __LINE__);
#endif
    printf("(%s:%d) main process terminating\n", __FILE__, __LINE__);
    return EXIT_SUCCESS;
}


Conclusion

In closing, authoring a GNU Build System compliant product offers flexibility to the user: it incorporates a standardized means for examining the target system, ensures the required product dependencies are met, and offers a well-defined configuration and build environment when distributing your product as source. Incorporating autoconf feature examination allows the conditional definition of preprocessor symbols. These symbols can be used to place conditional code in our application, giving us alternatives for continuing in the absence of optional features.

References

  1. Introduction to the GNU Build System; FSK Consulting