Build System Status Update 2013-05-14

May 13, 2013 at 07:35 PM | categories: Mozilla, build system

I'd like to make an attempt at delivering regular status updates on the Gecko/Firefox build system and related topics. Here we go with the first instance. I'm sure I missed awesomeness. Ping me and I'll add it to the next update.

MozillaBuild Windows build environment updated

Kyle Huey released version 1.7 of our Windows build environment. It contains a newer version of Python and a modern version of Mercurial among other features.

I highly recommend every Windows developer update ASAP. Please note that you will likely encounter Python errors unless you clobber your build.

New submodule and peers

I used my power as module owner to create a submodule of the build config module whose scope is the (largely mechanical) transition of content from Makefile.in to moz.build files. I granted Joey Armstrong and Mike Shal peer status for this module. I would like to eventually see both elevated to build peers of the main build module.

moz.build transition

The following progress has been made:

  • Mike Shal has converted variables related to defining XPIDL files in bug 818246.
  • Mike Shal converted MODULE in bug 844654.
  • Mike Shal converted EXPORTS in bug 846634.
  • Joey Armstrong converted xpcshell test manifests in bug 844655.
  • Brian O'Keefe converted PROGRAM in bug 862986.
  • Mike Shal is about to land conversion of CPPSRCS in bug 864774.

Non-recursive XPIDL generation

In bug 850380 I'm trying to land non-recursive building of XPIDL files. As part of this, I'm combining the generation of the .xpt and .h for each input .idl file into a single process call, because profiling revealed that parsing the IDL consumes most of the CPU time. This shaves a few dozen seconds off build times.

I have encountered multiple pymake bugs while developing this patch, which is the primary reason it hasn't landed yet.

WebIDL refactoring

I was looking at my build logs and noticed WebIDL generation was taking longer than I thought it should. I filed bug 861587 to investigate making it faster. While my initial profiling turned out to be wrong, Boris Zbarsky looked into things and discovered that the serialization and deserialization of the parser output was extremely slow. He is currently trying to land a refactor of how WebIDL bindings are handled. The early results look very promising.

I think the bug is a good example of the challenges we face improving the build system, as Boris can surely attest.

Test directory reorganization

Joel Maher is injecting sanity into the naming scheme of test directories in bug 852065.

Manifests for mochitests

Jeff Hammel, Joel Maher, Ted Mielczarek, and I are working on using manifests for mochitests (as xpcshell tests already do) in bug 852416.

Mach core is now a standalone package

I extracted the mach core to a standalone repository and added it to PyPI.

Mach now categorizes commands in its help output.

Requiring Python 2.7.3

Now that the Windows build environment ships with Python 2.7.4, I've filed bug 870420 to require Python 2.7.3+ to build the tree. We already require Python 2.7.0+. I want to bump the point release because there are many small bug fixes in 2.7.3, especially around Python 3 compatibility.

This is currently blocked on RelEng rolling out 2.7.3 to all the builders.
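
The check itself is trivial. Here is a minimal sketch of the kind of minimum-version guard configure or mach could perform (an illustration, not the actual in-tree code):

# Minimal sketch of a Python minimum-version guard (illustrative only).
import sys

if sys.version_info < (2, 7, 3):
    sys.stderr.write('Python 2.7.3 or newer is required to build the tree.\n')
    sys.exit(1)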

Eliminating master xpcshell manifest

Now that xpcshell test manifests are defined in moz.build files, we theoretically don't need the master manifest. Joshua Cranmer is working on removing it in bug 869635.

Enabling GTests and dual linking libxul

Benoit Gerard and Mike Hommey are working in bug 844288 to dual link libxul so GTests can eventually be enabled and executed as part of our automation.

This will regress build times since we need to link libxul twice. But, giving C++ developers the ability to write unit tests with a real testing framework is worth it, in my opinion.

ICU landing

ICU was briefly enabled in bug 853301 but then backed out because it broke cross-compiling. It should be on track for enabling in Firefox 24.

Resource monitoring in mozbase

I gave mozbase a class to record system resource usage. I plan to eventually hook this up to the build system so the build system records how long it took to perform key events. This will give us better insight into slow and inefficient parts of the build and will help us track build system speed improvements over time.
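
To give a flavor of what I mean by recording resource usage, here is an illustrative sketch (not the actual mozbase class; it assumes the third-party psutil package) that samples CPU usage and wall time around named build phases:

# Illustrative sketch of per-phase resource recording, not the mozbase API.
import time
import psutil

class ResourceRecorder(object):
    def __init__(self):
        self.phases = {}

    def record(self, name, func):
        """Run func() and record wall time and CPU usage for the phase."""
        psutil.cpu_percent(interval=None)  # prime the CPU counter
        start = time.time()
        func()
        self.phases[name] = {
            'wall_time': time.time() - start,
            'cpu_percent': psutil.cpu_percent(interval=None),
        }

The build system could then write these numbers somewhere after each key event, which is exactly the kind of data needed to track speed improvements over time.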

Sorted lists in moz.build files

I'm working on requiring lists in moz.build be sorted. Work is happening in bug 863069.
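
As a concrete (and hypothetical) illustration, a list in a moz.build file would need to be alphabetically sorted or the reader would reject it:

# Hypothetical moz.build excerpt: list values must be sorted.
CPPSRCS += [
    'Bar.cpp',
    'Baz.cpp',
    'Foo.cpp',
]

The exact variables covered and the enforcement mechanics are being worked out in the bug.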

This idea started as a suggestion on the dev-platform list. If anyone has more great ideas, don't hold them back!

Smartmake added to mach

Nicholas Alexander taught mach how to build intelligently by importing some of Josh Matthews' smartmake tool's functionality into the tree.

Source server fixed

Kyle Huey and Ted Mielczarek collaborated to fix the source server.

Auto clobber functionality

Auto clobber functionality was added to the tree. After flirting briefly with on-by-default, we changed it to opt-in. When a clobber is required, the message will tell you how to enable auto clobber.

Faster clobbers on automation

I was looking at build logs and identified that we were performing clobbers inefficiently.

Massimo Gervasini and Chris AtLee deployed changes to automation to make clobbers more efficient. My measurements showed a Windows try build that took 15 fewer minutes to start - a huge improvement.

Upgrading to Mercurial 2.5.4

RelEng is tracking the global deployment of Mercurial 2.5.4. hg.mozilla.org is currently running 2.0.2 and automation is all over the map. The upgrade should make Mercurial operations faster and more robust across the board.

I'm considering adding code to mach or the build system that prompts the user when her Mercurial is out of date (since an out of date Mercurial can result in a sub-par user experience).

Parallelize reftests

Nathan Froyd is leading an effort to parallelize reftest execution. If he pulls this off, it could shave hours off of the total automation load per checkin. Go Nathan!

Overhaul of MozillaBuild in the works

I am mentoring a pair of interns this summer. I'm still working out the final set of goals, but I'm keen to have one of them overhaul the MozillaBuild Windows development environment. Cross your fingers.


Mozilla Build System Brain Dump

May 13, 2013 at 05:25 PM | categories: build system, Mozilla, Firefox, mach

I hold a lot of context in my head when it comes to the future of Mozilla's build system and how developers will interact with it. I wanted to perform a brain dump of sorts so people have an idea of where I'm coming from when I inevitably propose radical changes.

The sad state of build system interaction and the history of mach

I believe that Mozilla's build system has had a poor developer experience for as long as there has been a Mozilla build system. Getting started with Firefox development was a rite of passage. It required following (often out-of-date) directions on MDN. It required finding pages through MDN search or asking other people for info over IRC. It was the kind of process that turned away potential contributors because it was just too damn hard.

mach - while born out of my initial efforts to radically change the build system proper - morphed into a generic command dispatching framework by the time it landed in mozilla-central. It has one overarching purpose: provide a single gateway point for performing common developer tasks (such as building the tree and running tests). The concept was nothing new - individual developers had long coded up scripts and tools to streamline workflows. Some even published these for others to use. What set mach apart was a unified interface for these commands (the mach script in the top directory of a checkout) and that these productivity gains were in the tree and thus easily discoverable and usable by everybody without significant effort (just run mach help).

While mach doesn't yet satisfy everyone's needs, it's slowly growing new features and making developers' lives easier with each one. All of this is happening despite the fact that nobody is tasked with working on mach full time. Until a few months ago, mach was largely my work. Recently, Matt Brubeck has been contributing a flurry of enhancements - thanks Matt! Ehsan Akhgari and Nicholas Alexander have contributed a few commands as well! There are also a few people with a single command to their name. This is fulfilling my original vision of enabling developers to scratch their own itches by contributing mach commands.
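
For those who haven't written one, a mach command is just a decorated method on a class living in the tree. Roughly like this (I'm omitting the argument and category decorators, and the exact decorator signatures may evolve):

# Rough sketch of a mach command definition.
from mach.decorators import CommandProvider, Command

@CommandProvider
class MyCommands(object):
    @Command('hello')
    def hello(self):
        print('Hello from mach!')
        return 0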

I've noticed more people referencing mach in IRC channels. And, more people get angry when a mach command breaks or changes behavior. So, I consider the mach experiment a success. Is it perfect? No. If it's not good enough for you, please file a bug and/or code up a patch. If nothing else, please tell me: I'd love to know about everyone's subtle requirements so I can keep them in mind when refactoring the build system and hacking on mach.

The object directory is a black box

One of the ideas I'm trying to advance is that the object directory should be considered a black box for the majority of developers. In my ideal world, developers don't need to look inside the object directory. Instead, they interact with it through condoned and supported tools (like mach).

I say this for a few reasons. First, as the build config module owner I would like the ability to massively refactor the internals of the object directory without disrupting workflows. If people are interacting directly with the object directory, I get significant push back if things change. This inevitably holds back much-needed improvements and triggers resentment towards me, build peers, and the build system. Not a good situation. Whereas if people are indirectly interacting with the object directory, we simply need to maintain a consistent interface (like mach) and nobody should care if things change.

Second, I believe that the methods used when directly interacting with the object directory are often sub-par compared with going through a more intelligent tool and that productivity suffers as a result. For example, when you type make inside the object directory, you need to know to pass -j8, to choose between make and pymake, and that you also need to build toolkit/library, etc. Also, by invoking make directly, you bypass other handy features, such as automatic compiler warning aggregation (which only happens if you invoke the build system through mach). If you go through a tool like mach, you should automatically get the most ideal experience possible.
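
To make that concrete, compare the incantations (paths and flags here are illustrative, not a recommended recipe):

# Rebuilding one directory by hand: remember -j, pick make vs. pymake,
# and relink libxul yourself.
make -j8 -C objdir/dom/src/storage
make -j8 -C objdir/toolkit/library

# Going through mach: sensible defaults, plus extras like compiler
# warning aggregation.
./mach build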

In order for this vision to be realized, we need massive improvements to tools like mach to cover the missing workflows that still require direct object directory interaction. We also need people to start using mach. I think increased mach usage comes after mach has established itself as obviously superior to the alternatives (I already believe it offers this for tasks like running tests).

I don't want to force mach upon people but...

Nobody likes being forced to change a process that has been familiar for years. Developers especially. I get it. That's why I've always attempted to position mach as an alternative to existing workflows. If you don't like mach, you can always fall back to the previous workflow. Or, you can improve mach (patches more than welcome!). Having gone down the please-use-this-tool-it's-better road before at other organizations, I strongly believe that the best way to drive adoption of a new tool is to gradually sway people through obvious superiority and praise (as opposed to a mandate to switch). I've been trying this approach with mach.

Lately, more and more people have been saying things like we should have the build infrastructure build through mach instead of client.mk and why do we need testsuite-targets.mk when we have mach commands. While I personally feel that client.mk and testsuite-targets.mk are antiquated as a developer-facing interface compared to mach, I'm reluctant to eliminate them because I don't like forcing change on others. That being said, there are compelling reasons to eliminate or at least refactor how they work.

Let's take testsuite-targets.mk as an example. This is the make file that provides the targets to run tests (like make xpcshell-test and make mochitest-browser-chrome). What's interesting about this file is that it's only used in local builds: our automation infrastructure does not use testsuite-targets.mk! Instead, mozharness and the old buildbot configs manually build up the commands used to invoke the test harnesses. Initially, the mach commands for running tests simply invoked make targets defined in testsuite-targets.mk. Lately, we've been converting the mach commands to invoke the Python test runners directly. I'd argue that the logic for invoking the test runners only needs to live in one place in the tree. Furthermore, as a build module peer, I have little desire to support multiple implementations, especially considering how fragile they can be.

I think we're trending towards an outcome where mach (or the code behind mach commands) becomes the authoritative invocation method and legacy interfaces like client.mk and testsuite-targets.mk are reimplemented to either call mach commands or the same routines that power them. Hopefully this will be completely transparent to developers.

The future of mozconfigs and environment configuration

mozconfig files are shell scripts used to define variables consumed by the build system. They are the only officially supported mechanism for configuring how the build system works.

I'd argue mozconfig files are a mediocre solution at best. First, there's the issue of mozconfig statements that don't actually do anything. I've seen no-op mozconfig content cargo culted into the in-tree mozconfigs (used for the builder configurations)! Oops. Second, doing things in mozconfig files is just awkward. Defining the object directory requires mk_add_options MOZ_OBJDIR=some-path. What's mk_add_options? If some-path is relative, what is it relative to? While certainly addressable, the documentation on how mozconfig files work is not terrific and fails to explain many pitfalls. Even with proper documentation, there's still the issue of the file format allowing no-op variable assignments to persist.
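
For those unfamiliar, a typical mozconfig looks something like this (an illustrative example, not a recommended configuration):

# mozconfig files are shell scripts, so typos and unknown variables
# silently do nothing.
ac_add_options --enable-debug
ac_add_options --disable-optimize

mk_add_options MOZ_OBJDIR=@TOPSRCDIR@/objdir-ff-debug
mk_add_options MOZ_MAKE_FLAGS="-j8"

# This assignment is a no-op (nothing consumes it), yet nothing complains.
export MY_FAVORITE_FLAG=1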

I'm very tempted to reinvent build configuration as something other than mozconfigs. What exactly, I don't know. mach has support for ini-like configuration files. We could certainly have mach and the build system pull configs from the same file.

I'm not sure what's going to happen here. But deprecating mozconfig files as they are today is part of many of the options.

Handling multiple mozconfig files

A lot of developers only have a single mozconfig file (per source tree at least). For these developers, life is easy. You simply install your mozconfig in one of the default locations and it's automagically used when you use mach or client.mk. Easy peasy.

I'm not sure what the relative numbers are, but many developers maintain multiple mozconfig files per source tree. e.g. they'll have one mozconfig to build desktop Firefox and another one for Android. They may have debug variations of each.

Some developers even have a single mozconfig file but leverage the fact that mozconfig files are shell scripts and have their mozconfig dynamically do things depending on the current working directory, value of an environment variable, etc.
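
For example, a dynamic mozconfig keyed off an environment variable might look like this (hypothetical; MOZ_VARIANT is a made-up variable name):

# Hypothetical dynamic mozconfig.
if test "$MOZ_VARIANT" = "android"; then
    ac_add_options --enable-application=mobile/android
    mk_add_options MOZ_OBJDIR=@TOPSRCDIR@/objdir-android
else
    ac_add_options --enable-application=browser
    mk_add_options MOZ_OBJDIR=@TOPSRCDIR@/objdir-desktop
fi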

I've also seen wrapper scripts that are little more than glorified ways to set environment variables, change directories, and invoke a command.

I've been thinking a lot about providing a common and well-supported solution for switching between active build configurations. Installing mach on $PATH goes a long way to facilitate this. If you are in an object directory, the mozconfig used when that object directory was created is automatically applied. Simple enough. However, I want people to start treating object directories as black boxes. So, I'd rather not see people have their shell inside the object directory.

Whenever I think about solutions, I keep arriving at a virtualenv-like solution. Developers would potentially need to activate a Mozilla build environment (similar to how Windows developers need to launch MozillaBuild). Inside this environment, the shell prompt would contain the name of the current build configuration. Users could switch between configurations using mach switch or some other magic command on the $PATH.

Truth be told, I'm skeptical that people would find this useful. I'm not sure it's that much better than exporting the MOZCONFIG environment variable to define the active config. This one requires more thought.

The integration between the build environment and Python

We use Python extensively in the build system and for common developer tasks. mach is written in Python. moz.build processing is implemented in Python. Most of the test harnesses are written in Python.

Doing practically anything in the tree requires a Python interpreter that knows about all the Python code in the tree and how to load it.

Currently, we have two very similar Python environments. One is a virtualenv created while running configure at the beginning of a build. The other is essentially a cheap knock-off that mach creates when it is launched.

At some point I'd like to consolidate these Python environments. From any Python process we should have a way to automatically bootstrap/activate into a well-defined Python environment. This certainly sounds like establishing a unified Python virtualenv used by both the build system and mach.

Unfortunately, things aren't straightforward. The virtualenv today is constructed in the object directory. How do we determine the current object directory? By loading the mozconfig file. How do we do that? Well, if you are mach, with Python code that lives in the tree. And how does mach know where to find the code that loads the mozconfig file? You can see the dilemma here.

A related issue is that of portable build environments. Currently, a lot of our automation recreates the build system's virtualenv from its own configuration (not the one from the source tree). This has bitten us and will continue to bite us. We'd really like to package up the virtualenv (or at least its config) with tests so there is no potential for discrepancy.

The inner workings of how we integrate with Python should be invisible to most developers. But, I figured I'd capture it here because it's an annoying problem. And, it's also related to an activated build environment. What if we required all developers to activate their shell with a Mozilla build environment (like we do on Windows)? Not only would this solve Python issues, but it would also facilitate simpler config switching (outlined above). Hmmm...

Direct interaction with the build system considered harmful

Ever since there was a build system, developers have been typing make (or make.py) to build the tree. One of the goals of the transition to moz.build files is to facilitate building the tree with Tup. make will do nothing when you're not using Makefiles! Another goal of the moz.build transition is to start derecursifying the make build system so that we build things in parallel. It's likely we'll produce monolithic make files and then process all targets for a related class (IDLs, C++ compilation, etc.) in one invocation of make. So, uh, what happens during a partial tree build? If a .cpp file from /dom/src/storage is being handled by a monolithic make file invoked by the Makefile at the top of the tree, how does a partial tree build pick that up? Does it build just that target or every target in the monolithic/non-recursive make file?

Unless the build peers go out of our way to install redundant targets in leaf Makefiles, directly invoking make from a subdirectory of the tree won't do what it's done for years.

As I said above, I'm sympathetic to people facing forced changes in procedure, so it's likely we'll provide backwards-compatible behavior. But I'd prefer not to. I'd first prefer that partial-tree builds not be necessary because a full tree build finishes quickly. But we're not going to get there for a bit. As an alternative, I'll take people building through mach build. That way, we have an easily extensible interface on which to build partial tree logic. We saw this recently when dumbmake/smartmake landed. And, going through mach also reinforces my ideal that the object directory is a black box.

Semi-persistent state

Currently, most state as it pertains to a checkout or build is in the object directory. This is fine for artifacts from the build system. However, there is a whole class of state that arguably shouldn't be in the object directory. Specifically, it shouldn't be clobbered when you rebuild. This includes logs from previous builds, the warnings database, previously failing tests, etc. The list is only going to grow over time.

I'd like to establish a location for semi-persistent state related to the tree and builds. Perhaps we change the clobber logic to ignore a specific directory. Perhaps we start storing things in the user's home directory. Perhaps we establish a second object directory called the state directory? How would this interact with build environments?

This will probably sit on the backburner until there is a compelling use case for it.

The battle against C++

Compiling C++ consumes the bulk of our build time. Anything we can do to speed up C++ compilation will work wonders for our build times.

I'm optimistic things like precompiled headers and compiling multiple .cpp files with a single process invocation will drastically decrease build times. However, no matter how much work we put in to make C++ compilation faster, we still have a giant issue: dependency hell.

As shown in my build system presentation a few months back, we have dozens of header files included by hundreds if not thousands of C++ files. If you change one of those headers, you invalidate build dependencies and trigger a rebuild of everything that includes it. This is why whenever files like mozilla-config.h change you are essentially confronted with a full rebuild. ccache may help if you are lucky. But, I fear that as long as headers proliferate the way they do, there is little the build system by itself can do.

My attitude towards this is to wait and see what we can get out of precompiled headers and the like. Maybe that makes it good enough. If not, I'll likely be making a lot of noise at Platform meetings requesting that C++ gurus brainstorm on a solution for reducing header proliferation.

Conclusion

Believe it or not, these are only some of the topics floating around in my head! But I've probably managed to bore everyone enough, so I'll call it a day.

I'm always interested in opinions and ideas, especially if they are different from mine. I encourage you to leave a comment if you have something to say.


moz.build Files and the Firefox Build System

February 28, 2013 at 07:45 PM | categories: Mozilla, Firefox, build system

The next time you update mozilla-central you may notice some significant changes with the build system. That's because this morning we finally landed the start of a massive overhaul of the build system! There are many end goals to this effort. The important ones for most will be faster build times and a build system that is friendlier to make changes to.

Introducing moz.build Files

If you look in the tree, you'll notice that nearly every directory now has a moz.build file.

moz.build files are what we are using to define the build system. Think of each one as a descriptor that describes how to build its part of the tree. An individual moz.build file will contain the C++ sources to compile, the headers to export, the tests to run, etc. Eventually. Currently, they are limited to directory traversal information.
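
For example, a present-day moz.build file typically contains nothing but traversal information, along these lines (directory names made up for illustration):

# A typical moz.build file today: only directory traversal information.
DIRS += ['base', 'src']
PARALLEL_DIRS += ['tests']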

moz.build files essentially add a level of indirection between the build system definition and how the tree is actually built. Before moz.build files, the same metadata we are now capturing in moz.build files (or plan to capture) was captured in Makefile.in files. We performed simple variable substitution on these Makefile.in files to produce Makefile files in the object directory. These Makefile files were used by GNU Make (or Pymake on Windows) to build the tree.

As I outlined in Improving Mozilla's Build System, Makefile.in files are suboptimal for a number of reasons. The important bit is they essentially tie us to the use of make (recursive or otherwise). We are very interested in supporting modern build systems such as Tup (the theory being they will build the tree faster).

Enter moz.build files. Storing our build configuration in moz.build files allows us to decouple the definition of the build system from the tool used to build it.

How moz.build Files Work

At the tail end of configure, the build system invokes the config.status script in the object directory. The role of config.status is to combine the information collected during configure with the build configuration obtained from moz.build files and take the necessary actions to ensure the build backend (make) can build the tree.

Before today, config.status essentially iterated over the source tree and converted Makefile.in files to Makefile in the object directory. Things are now slightly more complicated with moz.build files.

When config.status runs, it starts with the root moz.build from the source tree. It feeds this file into a Python interpreter. It then looks for special variables like DIRS and PARALLEL_DIRS to determine which directories contain additional moz.build files. It then descends into all the referenced directories, reading their moz.build files. While this is happening, we are converting the results of each moz.build file execution into backend.mk files that make knows how to build. It also performs the Makefile.in to Makefile conversion like it always has. When the whole process has finished, the object directory contains a bunch of Makefile and backend.mk files. make runs like it always has. The only real difference is some variables are coming from the moz.build-derived backend.mk files instead of Makefile.

This is just a brief overview, of course. If you want to know more, see the code in /python/mozbuild/mozbuild/frontend and /python/mozbuild/mozbuild/backend.

Build System Future

With the introduction of moz.build files, the intent is to eventually completely eliminate Makefile.in and have all the build definition live in moz.build files.

Doing this all at once would have been next to impossible. So, we decided to eliminate Makefile.in gradually. The first step is what landed today: essentially moving DIRS variables out of Makefile.in and into moz.build files. Next, we will be eliminating empty Makefile.in (bug 844635) and will be moving more parts of the build definition from Makefile.in to moz.build files. The next part to move over will likely be IDLs (bug 818246). After that, it may be exported files (EXPORTS in Makefile.in parlance). And repeat until we have no more Makefile.in in the tree.

Each migration of build definition data to moz.build files will likely occur in two phases:

  1. A largely mechanical move of the variables from Makefile.in to moz.build.
  2. Better build backend integration resulting from the move.

In phase #1, we will essentially cut and paste variable assignments to moz.build files. make will read the same variables it does today and perform the same actions. The only difference is the values in these variables will be defined in moz.build files.
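
In other words, a phase #1 conversion looks roughly like this (hypothetical directory names):

# Before, in Makefile.in:
#
#   DIRS = foo bar
#   PARALLEL_DIRS = tests
#
# After, in moz.build:
DIRS += ['foo', 'bar']
PARALLEL_DIRS += ['tests']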

In phase #2, we will leverage the fact that our build definition now has an API. We will change our existing make backend to be more efficient. For example, we should soon be able to compile IDLs and copy exported headers without make traversing through the directory tree at build time. We will be able to do this because the moz.build traversal at pre-build time sees data from all moz.build files and with this complete world view is able to produce more optimal make files than what you would get if you recursed into multiple directories. In short: it will make the build faster.

Once we have a sufficient portion of the build definition moved to moz.build files we will be able to experiment with new build backends (like Tup), look into automatic Visual Studio project generation, and more easily experiment with different ways of building (such as precompiled headers, fewer compiler process invocations, etc). These should all contribute to faster build times.

Frequently Asked Questions

What impact will I see from this change?

If you never touched Makefile.in files in the tree, you should not notice anything different about how the tree builds or how the build system works. You should have nothing to fear.

The most obvious changes to the source tree are:

  1. There is a moz.build file in almost every directory now.
  2. The variables related to directory traversal (those containing DIRS in their name) are now defined in moz.build files instead of Makefile.in.
  3. If your Makefile.in contains a variable that has been moved to moz.build files, make will spew an error when processing that file and the build will fail.

Will this change how I build?

It shouldn't. You should build the tree just like you always have. Most of you won't notice any differences.

If you see something weird, speak up in #build or file a bug if you are really confident it is wrong.

What are the risks to this change?

The migration of variables from Makefile.in to moz.build files is largely mechanical, and significant portions of it are performed by hand. This can be a mind-numbing and tedious process. Not helping things is the fact that Splinter's review interface for these kinds of patches is hard to read.

This all means that there is a non-trivial risk for transcription errors. All it takes is an inverted conditional block and all of a sudden large parts of the tree are no longer built, for example.

We have established bug 846825 to investigate any oddities from the initial transfer. Developers are encouraged to help with the effort. Please spot check that your directories are still being built, tests run, etc. Pay special attention to changes made in the last 4 months as these parts of Makefile.in would have been bit rotted and more prone to data loss.

Parts of the tree not enabled in standard configurations are more prone to breakage due to less testing. i.e. build configurations not captured by TBPL have a higher chance of breaking.

Will this make the tree build faster?

Not yet. But eventually it will. This initial landing paves the groundwork to making the tree build faster (see the Future section above).

I see a lot of empty moz.build files!

Yes. Sorry about that. The good news is they shouldn't be empty for long. As things move from Makefile.in to moz.build we'll see fewer and fewer empty moz.build files. We'll also see fewer and fewer Makefile.in files once we start deleting empty Makefile.in.

If you want to know why we must have empty files, it's mainly for validation. If we allowed moz.build files to be optional, how would you detect a typo in a directory name? By checking that the directory exists? What if that directory exists but isn't supposed to have a moz.build file?

You bitrotted my patches!

Yes. I'm sorry. The transition period to moz.build files could be a little messy. There will be lots of changes to Makefile.in and moz.build files and lots of chances for bit rot. Uplifts could be especially nasty. (Although I don't believe many uplifts involve significant changes to the build configuration.)

This all means there is a strong incentive for us to complete the transition as quickly as possible.

Can I help with the transition to moz.build files?

Yes!

The transition is largely mechanical (at least phase #1). If you are interested in moving a variable or set of variables, hop in #build on IRC and speak up!

You said moz.build files are actually Python files?!

Yes they are! However, they are executed in a very tightly controlled sandbox. You can't import modules, open files, etc. UPPERCASE variable names are reserved and only a few functions are exposed. If you attempt to assign to an unknown UPPERCASE variable or assign an invalid value, an error will occur. This is already much better than Makefile because we can now detect errors earlier in the build process (rather than 15 minutes into a build).
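
To illustrate with some hypothetical content, the sandbox accepts assignments to known variables and rejects everything else up front:

# Hypothetical moz.build content showing the sandbox rules.
DIRS += ['src']               # fine: DIRS is a known variable with a list value

import os                     # error: imports are not allowed in the sandbox
MY_SHINY_NEW_VARIABLE = True  # error: unknown UPPERCASE variable
DIRS = 'src'                  # error: invalid value type for DIRS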

What variables and functions are available in moz.build files?

If you run |./mach mozbuild-reference| you will see a print-out of all the variables, functions, and symbols that are exposed to the Python sandbox that moz.build files execute in. There are even tests that will fail the build if the sandbox contains symbols not in this output!

The output should be valid reStructuredText (in case you want to convert it to HTML for reading in your browser).

What if a moz.build file contains an error?

The build will break.

A lot of work has gone into making the output of moz.build errors human friendly and actionable. If you do something wrong, it won't just complain: it will tell you how to fix it!

Besides build times, how else will this improve the build system?

There are several ways!

As mentioned above, moz.build files are stricter about what data is allowed to be defined. If you assign to an UPPERCASE variable, that variable must be known to the sandbox or else the assignment will error. This means that if an assignment to an UPPERCASE variable succeeds, you know it has a side-effect. No more cargo culting of old, meaningless variables!

To change the behavior of moz.build files (add new variables or functions, change how makefile generation works, etc.) will require changes to the code in /python/mozbuild. This code belongs squarely to the build module and requires appropriate review. A problem with Makefiles is that they have lots of foot guns by default, and it's easy for self-inflicted wounds to land in the tree without explicit build peer review. This problem largely goes away with moz.build files because the sandbox takes away all of make's foot guns.

The output of a moz.build execution is essentially a static data structure. It's easy to validate them for conformance. If we discover bad practices in our build definition, we can centrally add tests for them and enforce best practices.

We will also see user experience wins by moving data to moz.build files. Take mochitests as an example. We currently have several flavors (plain, browser, chrome, etc). Sometimes you cannot distinguish the flavor by the filename alone. With moz.build files, it will be easier to answer questions like "what mochitest flavor is this file?" mach could hook into this so you can type |mach mochitest path/to/file.html| instead of |mach mochitest-plain path/to/file.html|. Even better, you should just be able to type |mach path/to/test.html| and mach will know from the build definition that path/to/test.html is a plain mochitest file and assume you want to run it. There are dozens of small development workflow wins to be gained here!

If I change a moz.build file, what happens?

If you change a moz.build file, then make should detect that it has changed and it will update the dynamically generated backend.mk file and reinvoke the requested build action. This should all happen automatically (just like Makefile.in to Makefile conversion works automatically).

My build seems to pause for a few seconds before starting!

A change to any moz.build file will cause a full traversal of the entire moz.build tree. On modern machines, this should only take 1-3 seconds. If your source tree is not in the page cache (and you need to load moz.build files from disk) or if you are on older hardware, this could be a little slower.

This is an unfortunate side-effect of having a whole world view of the build definition. The build delay incurred by these full scans should eventually be cancelled out by build backend optimizations resulting from having this whole world view, however.

The good news is this full scan should only occur if a moz.build file changes. And, if you are performing make recursion, it should only happen once (not in every directory). If you notice multiple moz.build scanning-related pauses during builds, please file a bug in Core :: Build Config!

Finally, we are performing the reads on a single thread currently. We can throw more cores at the task if someone codes up a patch.

What happened to allmakefiles.sh?

It has been sacked. allmakefiles.sh was an optimization to perform all the Makefile.in to Makefile conversion in one go. The directory traversal performed by moz.build reading effectively replaces the role of allmakefiles.sh. Not only that, but the moz.build build definition is always up to date! allmakefiles.sh was typically out of sync with reality and was a burden to maintain.

Did we just invent our own build system?

Kinda. We invented a generic Python sandboxing infrastructure. Then we hooked up code to populate it with variables from our build system and told it how to perform file traversal by reading specific variables set during file execution. Then we hooked up code for taking the evaluated results of all those sandboxes and convert them into make files.

Conceptually, what we invented is like GYP but with a different config file format. We have dabbled with the idea of converting the parsed build definition into GYP classes and then leveraging GYP to produce Makefiles, Ninja files, Visual Studio projects, etc. This would be an interesting experiment!

If you want to call it a build system, call it a build system. However, it is currently tightly coupled to Mozilla's needs, so you can't just use it anywhere. The concept might be worth exploring, however.

Is there anything else you'd like to share?

I think we set the record for most parts in a bug: 61. Although, they are numbered 1-17, 19-20. Part 18 has 30+ sub-parts using letters from the English and Greek alphabet for identifiers. Part 61 uses the infinity symbol as its number. See the pushlog.

Finally, I'd like to thank everyone who helped with this effort. The bug itself was only 6 months old and had active development off and on for a lot of it. Ted Mielczarek and Mike Hommey provided invaluable feedback on the core build system patches. A number of module owners stepped in to lend eyes to the mechanical conversion of their files. Last but not least, Ms2ger provided invaluable review aid on many of the patches. The work was so good that we all agreed that an Ms2ger f+ was good enough for a build peer rs! If reviewing the patches wasn't enough, Ms2ger also oversaw the tree closure and merging of the landing. I don't know how I will repay this debt.

Any more questions?

If you have more questions, drop in #build on irc.mozilla.org and ask away.


Firefox Build System Presentation

November 30, 2012 at 02:00 PM | categories: Mozilla, Firefox, build system

In case you missed it, I gave a presentation on the state of Firefox's build system yesterday.

You can watch it and view the slides online.

If you build Firefox from source regularly, you should definitely at least skim through the slide deck.

I'm not an HTML expert, so my apologies for the bad UI on the interactive slides. You may need to press enter to select items in dropdown menus. Also, the interactive slides are a bit resource intensive. If the slide deck is really slow, turn off those elements. I've also only tested the slides in Firefox 19 and 20. My apologies if they don't work everywhere.


Visual Studio Project Generation for mozilla-central

August 28, 2012 at 12:00 PM | categories: Mozilla, Firefox, build system

I have very alpha support for Visual Studio project generation for mozilla-central that daring people can dogfood.

I want to emphasize that this is extremely alpha. Normally, I wouldn't release things as fragile as they are. But, I know Windows developers sorely miss Visual Studio, especially IntelliSense. The current Visual Studio projects support IntelliSense, so I want to get this in the hands of Windows developers ASAP.

The current directions for making this all work are a bit hacky. Things will change once things have matured. For now, please excuse the mess.

First, you will need to grab the code. If you use Git, set up a remote to my repository:

git remote add indygreg git://github.com/indygreg/mozilla-central.git
git fetch indygreg

The branch of interest is build-splendid. I periodically rebase this branch on top of master. You have been warned.

You can switch to this branch:

git checkout -b build-splendid indygreg/build-splendid

Alternatively, you can squash it down to a single commit and merge it into your local branch. Once you've done that, you can record the SHA-1 of the commit and cherry-pick that wherever you like!

git merge --squash indygreg/build-splendid
git commit

In the current state, you need to build the tree or the Visual Studio projects will complain about missing files. It doesn't matter if you build the tree before or after Visual Studio projects are generated. But, we might as well get it out of the way. From your MozillaBuild environment, run:

./mach build

That should just work. If it doesn't, you may need to configure mach.ini. See my previous post on how to configure mach.ini. As a reference, my Windows config is:

[build]

configure_extra = --disable-webgl

[compiler]

[paths]
source_directory = c:\dev\src\mozilla-central-git
object_directory = c:\dev\src\mozilla-central-git\objdir

Now, to generate Visual Studio project files:

./mach backendconfig visualstudio

That should take about a minute to finish. When it's done, it should have created objdir/msvc/mozilla.sln. You should be able to load that in Visual Studio!

You will need to regenerate Visual Studio project files when the build config changes. As a rule of thumb, do this every time you pull source. You don't need to perform a full build before you generate Visual Studio files (you do need to perform configure, however). However, if you have not performed a full build, Visual Studio may not be able to find some files, like headers generated from IDLs.

Please close the solution before regenerating the project files. If you don't, Visual Studio puts up a modal dialog for each project file that changed and you have to click through over a hundred of these. It's extremely frustrating. I'm investigating workarounds.

Current State

Currently, it only generates projects for C/C++ compilation (libraries). I still need to add support for IDL, headers, etc. However, each project has proper compiler flags, header search paths, etc. So, IntelliSense is happy and some things do manage to compile!

Many parts are broken and sub-par.

I've only tested on Visual Studio 2008. If you are running Visual Studio 2010, you can try to upgrade the solution. This may work. The backend supports generating solutions for different versions, but I haven't tested that things work on anything other than 2008 and I don't want to expose untested behavior.

Compiling within Visual Studio works for some things. On my system, I get a lot of nullptr not defined errors. I'm not sure why. This will hopefully be worked out soon.

If you do manage to compile within Visual Studio, the output files don't go in the right places. So, if you do a build from the command line, it will have to re-compile to pick up changes.

Project names are based on the name of the library they produce. I'm not sure if this is the best solution.

Project dependencies are not set up. They will be added later.

Projects for linking libxul or building firefox.exe are not yet provided. Along the same vein, debugging support is not built-in. I'm working on it.

Basically, IntelliSense works. You can now use Visual Studio as a rich editor. Hopefully this is a step in the right direction.

I'm anxious to hear if this works for other people. Please leave comments!

