Waf BOOK
Contents

1 Download and installation
  1.1 Obtaining the Waf file
    1.1.1 Downloading and using the Waf binary
    1.1.2 Building Waf from the source code
  1.2 Using the waf file
    1.2.1 Permissions and aliases
    1.2.2 Local waflib folders
    1.2.3 Portability concerns

2 Projects and commands
  2.1 Waf commands
    2.1.1 Declaring Waf commands
    2.1.2 Chaining Waf commands
    2.1.3 Using several scripts and folders
  2.2 Waf project definition
    2.2.1 Configuring a project (the configure command)
    2.2.2 Removing generated files (the distclean command)
    2.2.3 Packaging the project sources (the dist command)
    2.2.4 Command-line options
  2.3 The build commands
    2.3.1 Building targets (the build command)
    2.3.2 Cleaning the targets (the clean command)
    2.3.3 More build commands

3 Project configuration
  3.1 Using persistent data
    3.1.1 Sharing data with the build
    3.1.2 Configuration set usage
  3.2 Configuration utilities
    3.2.1 Configuration methods
    3.2.2 Loading and using Waf tools
    3.2.3 Multiple configurations
  3.3 Exception handling
    3.3.1 Launching and catching configuration exceptions
    3.3.2 Transactions

4 Builds
  4.1 Essential build concepts
    4.1.1 Build order and dependencies
    4.1.2 Direct task declaration
    4.1.3 Task encapsulation by task generators
    4.1.4 Overview of the build phase
  4.2 More build options
    4.2.1 Executing specific routines before or after the build
    4.2.2 Installing files
    4.2.3 Listing the task generators and forcing specific task generators
    4.2.4 Execution step by step for debugging (the step command)

5 Node objects
  5.1 Design of the node class
    5.1.1 The node tree
    5.1.2 Node caching
  5.2 General usage
    5.2.1 Searching and creating nodes
    5.2.2 Listing files and folders
    5.2.3 Path manipulation: abspath, path_from
  5.3 BuildContext-specific methods
    5.3.1 Source and build nodes
    5.3.2 Using Nodes during the build phase
    5.3.3 Nodes, tasks, and task generators

6 Advanced project definitions
  6.1 Custom commands
    6.1.1 Context inheritance
    6.1.2 Command composition
    6.1.3 Binding a command from a Waf tool
  6.2 Custom build outputs
    6.2.1 Multiple configurations
    6.2.2 Changing the output directory
      6.2.2.1 Variant builds
      6.2.2.2 Configuration sets for variants

7 Task processing
  7.1 Task execution
    7.1.1 Main actors
    7.1.2 Build groups
    7.1.3 The producer-consumer system
    7.1.4 Task states and status
  7.2 Build order constraints
    7.2.1 The method set_run_after
    7.2.2 Computed constraints
  7.3 Dependencies
    7.3.1 Task signatures
    7.3.2 Explicit dependencies
      7.3.2.1 Input and output nodes
      7.3.2.2 Global dependencies on other nodes
  7.4 Task tuning
    7.4.1 Class access
    7.4.2 Scriptlet expressions
    7.4.3 Direct class modifications
      7.4.3.1 Always execute
      7.4.3.2 File hashes

8 Task generators
  8.1 Rule-based task generators (Make-like)
    8.1.1 Declaration and usage
    8.1.2 Rule functions
    8.1.3 Shell usage
    8.1.4 Inputs and outputs
    8.1.5 Dependencies on file contents
  8.2 Name and extension-based file processing
    8.2.1 Refactoring repeated rule-based task generators into implicit rules
    8.2.2 Chaining more than one command
    8.2.3 Scanner methods
    8.2.4 Extension callbacks
  8.3 General purpose task generators
    8.3.1 Task generator definition
    8.3.2 Executing the method during the build
    8.3.3 Task generator features
    8.3.4 Task generator method execution order
    8.3.5 Adding or removing a method for execution
    8.3.6 Expressing abstract dependencies between task generators

9 C and C++ projects
  9.1 Common script for C, C++ and D applications
    9.1.1 Predefined task generators
    9.1.2 Additional attributes
  9.2 Include processing
    9.2.1 Execution path and flags
    9.2.2 The Waf preprocessor
    9.2.3 Dependency debugging
  9.3 Library interaction (use)
    9.3.1 Local libraries
    9.3.2 Special local libraries
      9.3.2.1 Includes folders
      9.3.2.2 Object files
      9.3.2.3 Fake libraries
  9.4 Configuration helpers
    9.4.1 Configuration tests
    9.4.2 Advanced tests
    9.4.3 Creating configuration headers
    9.4.4 Pkg-config

10 Advanced scenarios
  10.1 Project organization
    10.1.1 Building the compiler first
    10.1.2 Providing arbitrary configuration files
  10.2 Mixing extensions and C/C++ features
    10.2.1 Files processed by a single task generator
    10.2.2 Resources shared by several task generators
  10.3 Task generator methods
    10.3.1 Replacing particular attributes
    10.3.2 Inserting special include flags
  10.4 Custom tasks
    10.4.1 Force the compilation of a particular task
    10.4.2 A compiler producing source files with names unknown in advance

11 Using the development version
  11.1 Execution traces
    11.1.1 Logging
    11.1.2 Build visualization
  11.2 Profiling
    11.2.1 Benchmark projects
    11.2.2 Profile traces
    11.2.3 Optimizations tips
  11.3 Waf programming
    11.3.1 Setting up a Waf directory for development
    11.3.2 Specific guidelines

12 Waf architecture overview
  12.1 Modules and classes
    12.1.1 Core modules
    12.1.2 Context classes
    12.1.3 Build classes
  12.2 Context objects
    12.2.1 Context commands and recursion
    12.2.2 Build context and persistence
  12.3 Support for c-like languages
    12.3.1 Compiled tasks and link tasks
  12.4 Writing re-usable Waf tools
    12.4.1 Adding a waf tool
      12.4.1.1 Importing the code
      12.4.1.2 Naming convention for C/C++/Fortran
    12.4.2 Command methods
      12.4.2.1 Subclassing is only for commands
      12.4.2.2 Domain-specific methods are convenient for the end users
      12.4.2.3 How to bind new methods

13 Further reading

14 Glossary
Introduction
Copyright 2010-2011 Thomas Nagy. Copies of this book may be redistributed, verbatim, and for non-commercial purposes. This book is licensed under the Creative Commons by-nc-nd license.
Chapter 1

Download and installation

1.1 Obtaining the Waf file
The Waf project is located on Google Code. The current Waf version requires an interpreter for the Python programming language such as cPython 2.3 to 3.1 or Jython >= 2.5.
1.1.1 Downloading and using the Waf binary
The Waf binary is a Python script which does not require any installation whatsoever. It may be executed directly from a writable folder. Just rename it to waf if necessary:
$ wget http://waf.googlecode.com/files/waf-1.6.8
$ mv waf-1.6.8 waf
$ python waf --version
waf 1.6.8 (11500)
The waf file has its own library compressed in a binary stream in the same file. Upon execution, the library is uncompressed into a hidden folder in the current directory. The folder will be re-created if removed. This scheme enables different Waf versions to be executed from the same folders:
$ ls -ld .waf*
.waf-1.6.8-2c924e3f453eb715218b9cc852291170
Note The binary file requires bzip2 compression support, which may be unavailable in some self-compiled cPython installations.
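The name of the hidden folder seen above combines the Waf version with a hexadecimal checksum of the embedded payload, which is what keeps concurrent Waf versions from clashing. Here is a minimal sketch of such a naming scheme; the use of an MD5 digest of the payload is an assumption made for illustration, not taken from the Waf source:

```python
import hashlib

def unpack_dir_name(version, payload):
    """Build a '.waf-VERSION-HEXDIGEST' folder name for an embedded payload.

    Hashing the payload (here with MD5, an assumption) gives distinct
    unpack folders to distinct waf binaries sharing one directory.
    """
    digest = hashlib.md5(payload).hexdigest()
    return '.waf-%s-%s' % (version, digest)

name = unpack_dir_name('1.6.8', b'example payload bytes')
print(name)
```

Two waf files with the same version but different payloads thus unpack into different hidden folders.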
1.1.2 Building Waf from the source code
Building Waf requires a Python interpreter having a version number in the range 2.6-3.1. The source code is then processed to support Python 2.3, 2.4 and 2.5.
$ wget http://waf.googlecode.com/files/waf-1.6.8.tar.bz2
$ tar xjvf waf-1.6.8.tar.bz2
$ cd waf-1.6.8
$ python waf-light
Configuring the project
Checking for program python : /usr/bin/python
Checking for python version : (2, 6, 5, final, 0)
configure finished successfully (0.176s)
Waf: Entering directory /waf-1.6.8/build
[1/1] create_waf: -> waf
Waf: Leaving directory /waf-1.6.8/build
build finished successfully (2.050s)
For older interpreters, it is possible to build the waf file with gzip compression instead of bzip2:
$ python waf-light --zip-type=gz
The files present in the folder waflib/extras represent extensions (Waf tools) that are in a testing phase. They may be added to the Waf binary by using the --tools switch:
$ python waf-light --tools=compat15,swig,doxygen
The tool compat15 is required to provide some compatibility with previous Waf versions. To remove it, it is necessary to modify the initialization by changing the --prelude switch:
$ python waf-light --make-waf --prelude= --tools=swig
Finally, here is how to import an external tool and load it in the initialization. Assuming the file aba.py is present in the current directory:
def foo():
	from waflib.Context import WAFVERSION
	print("This is Waf %s" % WAFVERSION)
The following will create a custom waf le which will import and execute foo whenever it is executed:
$ python waf-light --make-waf --tools=compat15,$PWD/aba.py \
      --prelude=$'\tfrom waflib.extras import aba\n\taba.foo()'
$ ./waf --help
This is Waf 1.6.8
[...]
Foreign files to add into the folder extras must be given by absolute paths in the --tools switch. Such files do not have to be Python files, but a typical scenario is to add an initializer that modifies existing functions and classes from the Waf modules. Various examples from the build system kit illustrate how to create custom build systems derived from Waf.
1.2 Using the waf file
1.2.1 Permissions and aliases
Because the waf script is a python script, it is usually executed by calling python on it:
$ python waf
On unix-like systems, it is usually much more convenient to set the executable permissions and avoid calling python each time:
$ chmod 755 waf
$ ./waf --version
waf 1.6.8 (11500)
If the command-line interpreter supports aliases, it is recommended to set the alias once:
$ alias waf=$PWD/waf
$ waf --version
waf 1.6.8 (11500)
Or, the execution path may be modified to point at the location of the waf binary:
$ export PATH=$PWD:$PATH
$ waf --version
waf 1.6.8 (11500)
In the next sections of the book, we assume that an alias or the execution path has been set so that waf may be called directly.
1.2.2 Local waflib folders
Although the waf library is unpacked automatically from the waf binary file, it is sometimes necessary to keep the files in a visible folder, which may even be kept under a source control tool (Subversion, Git, etc.). For example, the waf-light script does not contain the waf library, yet it is used to create the waf script by using the directory waflib. Waf uses a fixed lookup process to find the waflib directory.
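Conceptually, the lookup prefers a visible waflib directory next to the script, and falls back to the hidden unpack directory otherwise. A simplified stand-alone sketch of this idea follows; the exact search order in Waf may differ, and the function name is ours:

```python
import os

def find_waflib(script_dir, unpack_dir):
    """Return a directory expected to contain waflib.

    Prefer a visible 'waflib' folder next to the waf script (as used by
    waf-light and by checkouts kept under source control); otherwise
    fall back to the hidden unpack directory. Illustration only, not
    the actual waf lookup code.
    """
    local = os.path.join(script_dir, 'waflib')
    if os.path.isdir(local):
        return local
    return os.path.join(unpack_dir, 'waflib')
```

For example, calling find_waflib on a checkout that ships waflib returns the visible copy, so no unpacking is needed.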
1.2.3 Portability concerns
By default, the recommended Python interpreter is cPython, for which the supported versions are 2.3 to 3.1. For maximum convenience for the user, a copy of the Jython 2.5 interpreter may be redistributed along with a copy of the Waf executable.
Note A project containing waf, jython2.5.jar and the source code may be used almost anywhere.
Warning The waf script must reside in a writable folder to unpack its cache les.
Chapter 2

Projects and commands
2.1 Waf commands
Waf projects use description files named wscript, which are Python scripts containing functions and variables that may be used by Waf. Particular functions, called Waf commands, may be invoked by Waf from the command line.
2.1.1 Declaring Waf commands
Waf commands are simple functions that may execute arbitrary Python code, such as calling other functions. They take a single parameter as input and do not have to return any particular value, as in the following example:
#! /usr/bin/env python
# encoding: utf-8

def hello(ctx):
    print('hello world')

The Waf command here is hello, and the parameter ctx is a Waf context, used to share data between the scripts.
And here is how to have waf call the function hello from the command-line:
$ waf hello
hello world
hello finished successfully (0.001s)
2.1.2 Chaining Waf commands
Note The context parameter is a new object for each command executed. The classes are also different: ConfigureContext for configure, BuildContext for build, OptionContext for option, and Context for any other command.
2.1.3 Using several scripts and folders
Although a Waf project must contain a top-level wscript file, the contents may be split into several sub-project files. We will now illustrate this concept on a small project:
$ tree
|-- src
|   `-- wscript
`-- wscript
The commands in the top-level wscript will call the same commands from a sub-project wscript file by calling a context method named recurse:
def ping(ctx):
    print('ping from ' + ctx.path.abspath())
    ctx.recurse('src')
Note The method recurse and the attribute path are available on all Waf context classes.
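The effect of recurse can be modelled with plain Python: locate the wscript file of each given sub-directory and call the function bearing the name of the current command. The following is a rough conceptual model under those assumptions, not the actual waflib.Context implementation:

```python
import os

def recurse(ctx, base, command, *dirs):
    """Toy model of ctx.recurse(): execute the wscript file found in
    each sub-directory and call the function named after the current
    command, passing the same context object along."""
    for d in dirs:
        script = os.path.join(base, d, 'wscript')
        namespace = {}
        with open(script) as f:
            exec(f.read(), namespace)   # load the sub-project script
        namespace[command](ctx)         # call e.g. ping(ctx)
```

Sharing one context object across all the scripts is what lets sub-projects exchange data with the top-level script.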
2.2 Waf project definition
2.2.1 Configuring a project (the configure command)
Although Waf may be called from any folder containing a wscript file, it is usually a good idea to have a single entry point in the scripts. Besides ensuring a consistent behaviour, it also avoids redefining the same imports and functions in all wscript files. The following concepts help to structure a Waf project:

1. Project directory: directory containing the source files that will be packaged and redistributed to other developers or to end users
2. Build directory: directory containing the files generated by the project (configuration sets, build files, logs, etc.)
3. System files: files and folders which do not belong to the project (operating system files, etc.)

The predefined command named configure is used to gather and store the information about these folders. We will now extend the example from the previous section with the following top-level wscript file:

top = '.' (1)
out = 'build_directory' (2)
def configure(ctx): (3)
    print('configuring the project in ' + ctx.path.abspath())

def ping(ctx):
    print('ping from ' + ctx.path.abspath())
    ctx.recurse('src')
(1) A string representing the project directory. In general, top is set to '.', except for some proprietary projects where the wscript cannot be added to the top level; top may then be set to '../..' or even to some other folder such as '/checkout/perforce/project'
(2) A string representing the build directory. In general, it is set to 'build', except for some proprietary projects where the build directory may be set to an absolute path such as '/tmp/build'. It is important to be able to remove the build directory safely, so it should never be given as '.' or '..'
(3) The configure function is called by the configure command
$ cd /tmp/execution_configure (1)
$ waf configure (2)
configuring the project in /tmp/execution_configure
configure finished successfully
$ tree -a
|-- build_directory/ (3)
|   |-- c4che/ (4)
|   |   |-- build.config.py (5)
|   |   `-- _cache.py (6)
|   `-- config.log (7)
|-- .lock-wafbuild (8)
|-- src
|   `-- wscript
`-- wscript
$ waf ping
ping from /tmp/execution_configure
ping from /tmp/execution_configure/src
ping finished successfully (0.001s)
$ cd src
$ waf ping (9)
ping from /tmp/execution_configure
ping from /tmp/execution_configure/src
ping finished successfully (0.001s)
(1) To configure the project, change to the directory containing the top-level project file
(2) The configuration is launched by calling waf configure
(3) The build directory was created
(4) The configuration data is stored in the folder c4che/
(5) The command-line options and environment variables in use are stored in build.config.py
(6) The user configuration set is stored in _cache.py
(7) Configuration log (a duplicate of the output generated during the configuration)
(8) Hidden file pointing at the relevant project file and build directory
(9) Calling waf from a subfolder will execute the commands from the same wscript file used for the configuration
Note waf configure is always called from the directory containing the wscript file.
2.2.2 Removing generated files (the distclean command)
A command named distclean is provided to remove the build directory and the lock file created during the configuration. On the example from the previous section:
$ waf configure
configuring the project in /tmp/execution_configure
configure finished successfully (0.001s)
$ tree -a
|-- build_directory/
|   |-- c4che/
|   |   |-- build.config.py
|   |   `-- _cache.py
|   `-- config.log
|-- .lock-wafbuild
`-- wscript
$ waf distclean (1)
distclean finished successfully (0.001s)
$ tree (2)
`-- wscript
(1) The distclean command definition is implicit (no declaration in the wscript file)
(2) The tree is reverted to its original state: no build directory and no lock file
The behaviour of distclean is fairly generic, and the corresponding function does not have to be defined in the wscript files. It may be defined to alter this behaviour, though; see for example the following:
top = '.'
out = 'build_directory'

def configure(ctx):
    print('configuring the project')

def distclean(ctx):
    print('Not cleaning anything!')
Upon execution:
$ waf distclean
Not cleaning anything!
distclean finished successfully (0.000s)
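The default behaviour amounts to deleting the build directory and the lock file. A simplified stand-alone sketch of that idea follows; the directory and lock-file names are taken from the examples above, and the real implementation in waflib handles more cases (per-platform lock names, error handling):

```python
import os
import shutil

def distclean(project_dir, out='build_directory', lockfile='.lock-wafbuild'):
    """Remove the build directory and the configuration lock file,
    reverting the tree to its pre-configure state. Simplified sketch,
    not the waflib implementation."""
    build = os.path.join(project_dir, out)
    if os.path.isdir(build):
        shutil.rmtree(build)
    lock = os.path.join(project_dir, lockfile)
    if os.path.exists(lock):
        os.remove(lock)
```

Note how the safety remark about out applies here: if out were '.' or '..', rmtree would delete the project sources themselves.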
2.2.3 Packaging the project sources (the dist command)
The dist command is provided to create an archive of the project. By using the script presented previously:
top = '.'
out = 'build_directory'

def configure(ctx):
    print('configuring the project in ' + ctx.path.abspath())
By default, the project name and version are set to noname and 1.0. To change them, it is necessary to provide two additional variables in the top-level project file:
APPNAME = 'webe'
VERSION = '2.0'

top = '.'
out = 'build_directory'

def configure(ctx):
    print('configuring the project in ' + ctx.path.abspath())
Because the project was configured once, it is not necessary to configure it again:
$ waf dist
New archive created: webe-2.0.tar.bz2 (sha=7ccc338e2ff99b46d97e5301793824e5941dd2be)
dist finished successfully (0.006s)
More parameters may be given to alter the archive by adding a function dist in the script:
def dist(ctx):
    ctx.base_name = 'foo_2.0'                                           # (1)
    ctx.algo      = 'zip'                                               # (2)
    ctx.excl      = ' **/.waf-1* **/*~ **/*.pyc **/*.swp **/.lock-w* '  # (3)
    ctx.files     = ctx.path.ant_glob('**/wscript')                     # (4)
(1) The archive name may be given directly instead of being computed from APPNAME and VERSION
(2) The default compression format is tar.bz2. Other valid formats are zip and tar.gz
(3) Exclusion patterns passed to ctx.path.ant_glob(), which is used to find the files
(4) The files to add to the archive may be given as Waf node objects (excl is then ignored)
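The excl attribute above holds space-separated glob patterns. Their filtering effect can be approximated with the standard fnmatch module; this is only an illustration of the idea, since real ant_glob patterns are richer (full '**' recursion, among other things), and the helper name is ours:

```python
import fnmatch

def excluded(path, excl):
    """Return True if a relative path matches one of the space-separated
    exclusion patterns. Rough approximation of the excl filtering."""
    for pattern in excl.split():
        if fnmatch.fnmatch(path, pattern):
            return True
        # Also try the pattern without the leading '**/' so that
        # top-level files such as '.lock-wafbuild' can match.
        if pattern.startswith('**/') and fnmatch.fnmatch(path, pattern[3:]):
            return True
    return False

excl = '**/.waf-1* **/*~ **/*.pyc **/*.swp **/.lock-w*'
print(excluded('src/main.pyc', excl))   # True
print(excluded('wscript', excl))        # False
```

The patterns in the example thus strip editor backups, compiled Python files and Waf bookkeeping files from the archive.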
2.2.4 Command-line options
The Waf script provides various default command-line options, which may be consulted by executing waf --help:
$ waf --help
waf [command] [options]

Main commands (example: ./waf build -j4)
  build    : executes the build
  clean    : cleans the project
  configure: configures the project
  dist     : makes a tarball for redistributing the sources
  distcheck: checks if the project compiles (tarball from dist)
  distclean: removes the build directory
  install  : installs the targets on the system
  list     : lists the targets to execute
  step     : executes tasks in a step-by-step fashion, for debugging
  uninstall: removes the targets installed

Options:
  --version             show program's version number and exit
  -h, --help            show this help message and exit
  -j JOBS, --jobs=JOBS  amount of parallel jobs (2)
  -k, --keep            keep running happily even if errors are found
  -v, --verbose         verbosity level -v -vv or -vvv [default: 0]
  --nocache             ignore the WAFCACHE (if set)
  --zones=ZONES         debugging zones (task_gen, deps, tasks, etc)

  configure options:
    -o OUT, --out=OUT   build dir for the project
    -t TOP, --top=TOP   src dir for the project
    --prefix=PREFIX     installation prefix [default: /usr/local/]
    --download          try to download the tools if missing

  build and install options:
    -p, --progress      -p: progress bar; -pp: ide output
    --targets=TARGETS   task generators, e.g. "target1,target2"

  step options:
    --files=FILES

  install/uninstall options:
    --destdir=DESTDIR   installation root [default: ]
    -f, --force         force file installation
Accessing a command-line option is possible from any command. Here is how to access the value of prefix:
top = '.'
out = 'build_directory'

def configure(ctx):
    print('prefix is ' + ctx.options.prefix)
To define project command-line options, a special command named options may be defined in user scripts. This command will be called once before any other command executes.
top = '.'
out = 'build_directory'

def options(ctx):
    ctx.add_option('--foo', action='store', default=False, help='Silly test')

def configure(ctx):
    print('the value of foo is %r' % ctx.options.foo)
The command context for options is a shortcut to access the optparse functionality. For more information on the optparse module, consult the Python documentation.
2.3 The build commands

2.3.1 Building targets (the build command)
The build command is used for building targets. We will now create a new project in /tmp/execution_build/, and add a script to create an empty file foo.txt and then copy it into another file bar.txt:
top = '.'
out = 'build_directory'

def configure(ctx):
    pass

def build(ctx):
    ctx(rule='touch ${TGT}', target='foo.txt')
    ctx(rule='cp ${SRC} ${TGT}', source='foo.txt', target='bar.txt')
$ cd /tmp/execution_build/
$ waf build
The project was not configured: run "waf configure" first!
The build requires a configured folder to know where to look for source files and where to output the created files. Let's try again:
$ waf configure build
'configure' finished successfully (0.007s)
Waf: Entering directory `/tmp/execution_build/build_directory'
[1/2] foo.txt:  -> build_directory/foo.txt  (1)
[2/2] bar.txt: build_directory/foo.txt -> build_directory/bar.txt
Waf: Leaving directory `/tmp/execution_build/build_directory'
'build' finished successfully (0.041s)

$ tree -a
|-- build_directory/
|   |-- bar.txt  (2)
|   |-- c4che/
|   |   |-- build.config.py
|   |   `-- _cache.py
|   |-- foo.txt
|   |-- config.log
|   `-- .wafpickle  (3)
|-- .lock-wafbuild
`-- wscript

$ waf build
Waf: Entering directory `/tmp/execution_build/build_directory'
Waf: Leaving directory `/tmp/execution_build/build_directory'
'build' finished successfully (0.008s)  (4)
(1) Note that the build deduced that bar.txt has to be created after foo.txt
(2) The targets are created in the build directory
(3) A pickle file is used to store the information about the targets
(4) Since the targets are up-to-date, they do not have to be created once again
Since the command waf build is usually executed very often, a shortcut is provided to call it implicitly:
$ waf
Waf: Entering directory `/tmp/execution_build/build_directory'
Waf: Leaving directory `/tmp/execution_build/build_directory'
2.3.2 Cleaning the targets (the clean command)
The clean command is used to remove the information about the files and targets created during the build. It uses the same function build from the wscript files, so there is no need to add a function named clean in the wscript file. After cleaning, the targets will be created once again, even if they were up-to-date:
$ waf clean build -v
'clean' finished successfully (0.003s)
Waf: Entering directory `/tmp/execution_build/build_directory'  (1)
[1/2] foo.txt:  -> build_directory/foo.txt  (2)
14:58:34 runner 'touch foo.txt'  (3)
[2/2] bar.txt: build_directory/foo.txt -> build_directory/bar.txt
14:58:34 runner 'cp foo.txt bar.txt'
Waf: Leaving directory `/tmp/execution_build/build_directory'
'build' finished successfully (0.040s)
(1) All commands are executed from the build directory by default
(2) The information about the file foo.txt was lost, so it is rebuilt
(3) By using the -v flag, the command lines executed are displayed
2.3.3 More build commands
The following commands all use the same function build from the wscript file:

1. build: process the source code to create the object files
2. clean: remove the object files that were created during a build (unlike distclean, it does not remove the configuration)
3. install: check that all object files have been generated and copy them on the system (programs, libraries, data files, etc)
4. uninstall: undo the installation, removing the object files from the system without touching the ones in the build directory
5. list: list the task generators in the build section (to use with waf --targets=name)
6. step: force the rebuild of particular files for debugging purposes

The attribute cmd holds the name of the command being executed:
top = '.'
out = 'build_directory'

def configure(ctx):
    print(ctx.cmd)

def build(ctx):
    if ctx.cmd == 'clean':
        print('cleaning!')
    else:
        print(ctx.cmd)
The build command usage will be described in detail in the next chapters.
Chapter 3
Project configuration
The configuration command is used to check whether the requirements for working on a project are met. The parameters are then stored for use by other commands, such as the build command.
3.1 Using persistent data

3.1.1 Sharing data with the build
The configuration context is used to store data which may be re-used during the build. Let's begin with the following example:
top = '.'
out = 'build'

def options(ctx):
    ctx.add_option('--foo', action='store', default=False, help='Silly test')

def configure(ctx):
    ctx.env.FOO = ctx.options.foo  (1)
    ctx.find_program('touch', var='TOUCH')  (2)

def build(ctx):
    print(ctx.env.FOO)  (3)
    ctx(rule='${TOUCH} ${TGT}', target='foo.txt')  (4)
(1) Store the option foo into the variable env (a dict-like structure)
(2) Configuration routine used to find the program touch and to store it into ctx.env.TOUCH
(3) Print the value of ctx.env.FOO that was set during the configuration
(4) The variable ${TOUCH} corresponds to the variable ctx.env.TOUCH
Note that find_program may use the same variable from the OS environment during the search, for example: CC=gcc waf configure.
/usr/bin/touch  (1) (2)
abcd
[1/1] foo.txt:  -> build/foo.txt
10:56:41 runner '/usr/bin/touch foo.txt'  (3)
Waf: Leaving directory `/tmp/configuration_build/build'
'build' finished successfully (0.021s)
(1) Output of the configuration test find_program
(2) The value of TOUCH
(3) Command line used to create the target foo.txt
The variable ctx.env is called a configuration set, and is an instance of the class ConfigSet. The class is a wrapper around Python dicts to handle serialization. For this reason it should be used for simple variables only (no functions or classes). The values are stored in a Python-like format in the build directory:
$ tree build/
build/
|-- foo.txt
|-- c4che
|   |-- build.config.py
|   `-- _cache.py
`-- config.log

$ cat build/c4che/_cache.py
FOO = 'abcd'
PREFIX = '/usr/local'
TOUCH = '/usr/bin/touch'
Note: reading and writing values to ctx.env is possible in both configuration and build commands. Yet, the values are stored to a file only during the configuration phase.
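The "NAME = value" format shown in _cache.py above can be illustrated with a short sketch. The functions below are hypothetical stand-ins, not Waf's own serialization code; they only show the idea of writing flat key/value pairs as Python literals and reading them back:

```python
import ast

def store_env(env, path):
    """Write a flat dict of simple values in a _cache.py-like format
    (illustrative sketch, not waflib.ConfigSet.store)."""
    with open(path, 'w') as f:
        for key in sorted(env):
            f.write('%s = %r\n' % (key, env[key]))

def load_env(path):
    """Read the file back by evaluating each right-hand side as a literal."""
    env = {}
    with open(path) as f:
        for line in f:
            key, _, value = line.partition(' = ')
            env[key] = ast.literal_eval(value)
    return env
```

This is also why only simple values (strings, lists, numbers) belong in ctx.env: anything that cannot be written as a literal cannot be restored this way.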
3.1.2 Configuration set usage
We will now provide more examples of the configuration set usage. The object ctx.env provides convenience methods to access its contents:
top = '.'
out = 'build'

def configure(ctx):
    ctx.env['CFLAGS'] = ['-g']  (1)
    ctx.env.CFLAGS = ['-g']  (2)
    ctx.env.append_value('CXXFLAGS', ['-O2', '-g'])  (3)
    ctx.env.append_unique('CFLAGS', ['-g', '-O2'])
    ctx.env.prepend_value('CFLAGS', ['-O3'])  (4)

    print(type(ctx.env))
    print(ctx.env)
    print(ctx.env.FOO)
(1) Key-based access; storing a list
(2) Attribute-based access (the two forms are equivalent)
(3) Append each element to the list ctx.env.CXXFLAGS, assuming it is a list
(4) Insert the values at the beginning. Note that there is no such method as prepend_unique

The object conf.env is an instance of the class ConfigSet defined in waflib/ConfigSet.py. When a key is undefined, it is assumed to be a list (this is used by append_value above). The object conf.env is stored by default in the build directory (c4che/_cache.py).
top = '.'
out = 'build'

def configure(ctx):
    env_copy = ctx.env.derive()  (1)
    node = ctx.path.make_node('test.txt')  (2)
    env_copy.store(node.abspath())  (3)

    from waflib.ConfigSet import ConfigSet
    env2 = ConfigSet()  (4)
    env2.load(node.abspath())  (5)
    print(node.read())  (6)
(1) Make a copy of ctx.env — this is a shallow copy
(2) Use ctx.path to create a node object representing the file test.txt
(3) Store the contents of env_copy into test.txt
(4) Create a new empty ConfigSet object
(5) Load the values from test.txt
(6) Print the contents of test.txt
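The append/prepend semantics described above can be sketched with a tiny dict subclass. MiniEnv is an illustrative stand-in, not waflib's ConfigSet; it only mirrors the behaviour documented here (undefined keys read as empty lists, appending rebinds a fresh list):

```python
class MiniEnv(dict):
    """Toy model of a configuration set (not waflib.ConfigSet)."""

    def __missing__(self, key):
        # an undefined key is assumed to be a list
        return []

    def append_value(self, key, vals):
        # rebind a fresh list instead of mutating in place
        self[key] = self[key] + vals

    def append_unique(self, key, vals):
        current = self[key]
        self[key] = current + [v for v in vals if v not in current]

    def prepend_value(self, key, vals):
        self[key] = vals + self[key]

env = MiniEnv()
env.append_value('CXXFLAGS', ['-O2', '-g'])
env.append_unique('CXXFLAGS', ['-g', '-Wall'])   # '-g' is already present
env.prepend_value('CXXFLAGS', ['-O3'])
# env['CXXFLAGS'] == ['-O3', '-O2', '-g', '-Wall']
```

Rebinding a new list rather than modifying the existing one is also what makes such methods safe to combine with shallow copies, a point that matters again in the section on transactions.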
3.2 Configuration utilities

3.2.1 Configuration methods
The method ctx.find_program seen previously is an example of a configuration method. Here are more examples:
top = '.'
out = 'build'

def configure(ctx):
    ctx.find_program('touch', var='TOUCH')
    ctx.check_waf_version(mini='1.6.8')
    ctx.find_file('fstab', ['/opt', '/etc'])
Although these methods are provided by the context class waflib.Configure.ConfigurationContext, they will not appear on it in the API documentation. For modularity reasons, they are defined as simple functions and then bound dynamically:
top = '.'
out = 'build'

from waflib.Configure import conf  (1)

@conf  (2)
def hi(ctx):
    print('hello, world!')

# hi = conf(hi)  (3)

def configure(ctx):
    ctx.hi()  (4)
(1) Import the decorator conf
(2) Use the decorator to bind the method hi to the configuration context and build context classes. In practice, the configuration methods are only used during the configuration phase.
(3) Decorators are simple Python functions. Python 2.3 does not support the @ syntax, so the function has to be called after the function declaration.
(4) Use the method previously bound to the configuration context class
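The binding performed by @conf can be sketched in plain Python. The decorator below is an illustration of the mechanism, not Waf's actual conf decorator (which binds to both the configuration and build context classes without taking a class argument):

```python
def bind(cls):
    """Attach the decorated function as a method of cls
    and return it unchanged (in the spirit of @conf)."""
    def decorator(func):
        setattr(cls, func.__name__, func)
        return func
    return decorator

class Ctx(object):
    """Hypothetical stand-in for a context class."""
    pass

@bind(Ctx)
def hi(self):
    return 'hello, world!'

# the function is now callable as a method on any Ctx instance:
# Ctx().hi() -> 'hello, world!'
```

Because setattr stores the plain function on the class, the usual descriptor machinery turns it into a bound method on instances, which is exactly why ctx.hi() works after the decoration.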
3.2.2 Loading and using Waf tools
For efficiency reasons, only a few configuration methods are present in the Waf core. Most configuration methods are loaded by extensions called Waf tools. The main tools are located in the folder waflib/Tools, and the tools in the testing phase are located under the folder waflib/extras. Yet, Waf tools may be used from any location on the filesystem. We will now demonstrate a very simple Waf tool named dang.py, which will be used to set ctx.env.DANG from a command-line option:
#! /usr/bin/env python
# encoding: utf-8

print('loading the dang tool')

from waflib.Configure import conf

def options(opt):  (1)
    opt.add_option('--dang', action='store', default='', dest='dang')

@conf
def read_dang(ctx):  (2)
    ctx.start_msg('Checking for the variable DANG')
    if ctx.options.dang:
        ctx.env.DANG = ctx.options.dang  (3)
        ctx.end_msg(ctx.env.DANG)
    else:
        ctx.end_msg('DANG is not set')

def configure(ctx):  (4)
    ctx.read_dang()
(1) Provide command-line options
(2) Bind the function read_dang as a new configuration method, to call ctx.read_dang() below
(3) Set a persistent value from the current command-line options
(4) Provide a command named configure accepting a build context instance as parameter
For loading a tool, the method load must be used during the configuration:
top = '.'
out = 'build'

def options(ctx):
    ctx.load('dang', tooldir='.')  (1)

def configure(ctx):
    ctx.load('dang', tooldir='.')  (2)

def build(ctx):
    print(ctx.env.DANG)  (3)
(1) Load the options defined in dang.py
(2) Load the tool dang.py. By default, load calls the method configure defined in the tool: first the tool is imported as a Python module, and then its configure method is called. The tools loaded during the configuration will also be loaded during the build phase.
(3) The tool modifies the value of ctx.env.DANG during the configuration
3.2.3 Multiple configurations
The conf.env object is an important part of the configuration, which is accessed and modified by Waf tools and by user-provided configuration functions. The Waf tools do not enforce a particular structure for the build scripts, so the tools only modify the contents of the default object. The user scripts may provide several env objects in the configuration and pre-set or post-set specific values:
def configure(ctx):
    env = ctx.env  (1)
    ctx.setenv('debug')  (2)
    ctx.env.CC = 'gcc'  (3)
    ctx.load('gcc')

    ctx.setenv('release', env)  (4)
    ctx.load('msvc')
    ctx.env.CFLAGS = ['/O2']

    print(ctx.all_envs['debug'])  (5)
(1) Save a reference to conf.env
(2) Copy and replace conf.env
(3) Modify conf.env
(4) Copy and replace conf.env again, from the initial data
(5) Recall a configuration set by its name
3.3 Exception handling

3.3.1 Launching and catching configuration exceptions
Configuration helpers are methods provided by the conf object to help find parameters, for example the method conf.find_program:
top = '.'
out = 'build'

def configure(ctx):
    ctx.find_program('some_app')
When a test cannot complete properly, an exception of the type waflib.Errors.ConfigurationError is raised. This often occurs when something is missing in the operating system environment, or when a particular condition is not satisfied. For example:
$ waf
Checking for program some_app : not found
error: The program some_app could not be found
These exceptions may also be raised manually, by using ctx.fatal:

top = '.'
out = 'build'

def configure(ctx):
    ctx.fatal("I'm sorry Dave, I'm afraid I can't do that")
For convenience, the module waflib.Errors is bound to ctx.errors. Information may also be added to the log file by using ctx.to_log; the resulting config.log contains useful information about the configuration execution.
Catching the errors by hand can be inconvenient. For this reason, all @conf methods accept a parameter named mandatory to suppress configuration errors. The following is equivalent to catching the ConfigurationError and ignoring it:
top = '.'
out = 'build'

def configure(ctx):
    ctx.find_program('some_app', mandatory=False)
As a general rule, clients should never rely on exit codes or returned values, and must catch configuration exceptions. The tools should always raise configuration errors to display the errors and to give the clients a chance to process the exceptions.
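The raise-or-suppress pattern behind mandatory can be sketched in plain Python. The names below (find_program taking a hypothetical lookup table, a stand-in ConfigurationError) are illustrative only, not Waf's implementation:

```python
class ConfigurationError(Exception):
    """Stand-in for waflib.Errors.ConfigurationError."""

def find_program(name, available, mandatory=True):
    """Sketch of a configuration helper: return a result when the
    test succeeds, raise when it fails, unless mandatory=False."""
    if name in available:
        return available[name]
    if mandatory:
        raise ConfigurationError('The program %r could not be found' % name)
    return None

paths = {'touch': '/usr/bin/touch'}   # hypothetical search results
find_program('touch', paths)                       # '/usr/bin/touch'
find_program('some_app', paths, mandatory=False)   # None, no exception
```

A caller that needs custom handling still catches the exception; mandatory=False is only a shortcut for the common "optional dependency" case.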
3.3.2 Transactions
Waf tools called during the configuration may use and modify the contents of conf.env at will. Those changes may be complex to track and to undo. Fortunately, the configuration exceptions make it possible to simplify the logic and to go back to a previous state easily. The following example illustrates how to use a transaction to try several tools in turn:
top = '.'
out = 'build'

def configure(ctx):
    for compiler in ('gcc', 'msvc'):
        try:
            ctx.env.stash()
            ctx.load(compiler)
        except ctx.errors.ConfigurationError:
            ctx.env.revert()
        else:
            break
    else:
        ctx.fatal('Could not find a compiler')
Though several calls to stash can be made, the copies made are shallow, which means that any in-place modification of a complex object (such as a list) will be permanent. For this reason, the following is a configuration anti-pattern:
def configure(ctx):
    ctx.env.CFLAGS += ['-O2']

The ConfigSet methods such as append_value should be used instead, as they rebind a new list rather than modifying the stored one in place.
Chapter 4
Builds
We will now provide a detailed description of the build phase, which is used for processing the build targets.
4.1 Essential build concepts

4.1.1 Build order and dependencies
To illustrate the various concepts that are part of the build process, we are now going to use a new example. The files foo.txt and bar.txt will be created by copying the file wscript, and the file foobar.txt will be created from the concatenation of the generated files. Here is a summary: [1]
cp: wscript -> foo.txt
cp: wscript -> bar.txt
cat: foo.txt, bar.txt -> foobar.txt
Each of the three lines represents a command to execute. While the cp commands may be executed in any order, or even in parallel, the cat command may only be executed after all the others are done. The constraints on the build order are represented by the following directed acyclic graph:
(figure: build order graph — the two cp commands both precede the cat command)
When the wscript input file changes, the foo.txt output file has to be created once again. The file foo.txt is said to depend on the wscript file. The file dependencies can be represented by a directed acyclic graph too:
(figure: file dependency graph — foo.txt and bar.txt depend on wscript, and foobar.txt depends on foo.txt and bar.txt)
[1] The examples are provided for demonstration purposes. It is actually considered a best practice to avoid copying files.
Building a project consists in executing the commands according to a schedule that respects these constraints. Faster builds are obtained when commands are executed in parallel (by using the build order), and when commands can be skipped (by using the dependencies). In Waf, the commands are represented by task objects. The dependencies are used by the task classes, and may be file-based or abstract to enforce particular constraints.
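The scheduling idea can be sketched with the standard library's topological sorter. This is only an illustration of ordering by dependencies, not Waf's scheduler:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Dependency graph of the example: each target maps to what it depends on
deps = {
    'foo.txt': {'wscript'},
    'bar.txt': {'wscript'},
    'foobar.txt': {'foo.txt', 'bar.txt'},
}

order = list(TopologicalSorter(deps).static_order())
# 'wscript' comes first and 'foobar.txt' last; the two cp targets may
# appear in either order, and could therefore be processed in parallel
```

TopologicalSorter also exposes get_ready()/done() for driving parallel execution, which corresponds to the "commands with no mutual constraints may run concurrently" observation above.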
4.1.2 Direct task declaration
We will now represent the build from the previous section by declaring the tasks directly in the build section:
def configure(ctx):
    pass

from waflib.Task import Task

class cp(Task):  (1)
    def run(self):  (2)
        return self.exec_command('cp %s %s' % (
            self.inputs[0].abspath(),  (3)
            self.outputs[0].abspath()
        ))

class cat(Task):
    def run(self):
        return self.exec_command('cat %s %s > %s' % (
            self.inputs[0].abspath(),
            self.inputs[1].abspath(),
            self.outputs[0].abspath()
        ))

def build(ctx):
    cp_1 = cp(env=ctx.env)  (4)
    cp_1.set_inputs(ctx.path.find_resource('wscript'))  (5)
    cp_1.set_outputs(ctx.path.find_or_declare('foo.txt'))
    ctx.add_to_group(cp_1)  (6)

    cp_2 = cp(env=ctx.env)
    cp_2.set_inputs(ctx.path.find_resource('wscript'))
    cp_2.set_outputs(ctx.path.find_or_declare('bar.txt'))
    ctx.add_to_group(cp_2)

    cat_1 = cat(env=ctx.env)
    cat_1.set_inputs(cp_1.outputs + cp_2.outputs)
    cat_1.set_outputs(ctx.path.find_or_declare('foobar.txt'))
    ctx.add_to_group(cat_1)
(1) Task class declaration
(2) Waf tasks have a method named run to generate the targets
(3) Instances of waflib.Task.Task have input and output objects representing the files to use (Node objects)
(4) Create a new task instance manually
(5) Set input and output files represented as waflib.Node.Node objects
(6) Add the task to the build context for execution (but do not execute it immediately)
$ waf build
Waf: Entering directory `/tmp/build_manual_tasks/build'
[1/3] cp: wscript -> build/foo.txt
[2/3] cp: wscript -> build/bar.txt
[3/3] cat: build/foo.txt build/bar.txt -> build/foobar.txt
Waf: Leaving directory `/tmp/build_manual_tasks/build'
'build' finished successfully (0.043s)
The tasks are not executed during the clean command, and the build keeps track of the files that were generated to avoid generating them again: after one of the source files is modified, a new build only regenerates the targets according to the dependency graph.
Please remember:

1. The execution order was computed automatically, by using the file inputs and outputs set on the task instances
2. The dependencies were computed automatically (the files were rebuilt when necessary), by using the node objects (hashes of the file contents were stored between the builds and then compared)
3. The tasks that have no order constraints are executed in parallel by default
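Point 2 — skipping up-to-date targets by comparing stored content hashes — can be sketched as follows. This is a simplified illustration of the idea, not Waf's signature machinery:

```python
import hashlib

def file_hash(path):
    """Hash the file contents, similar in spirit to the signatures
    stored between builds (simplified sketch)."""
    with open(path, 'rb') as f:
        return hashlib.md5(f.read()).hexdigest()

def must_rebuild(sources, signatures):
    """Return True if any source hash differs from its stored signature."""
    return any(signatures.get(p) != file_hash(p) for p in sources)
```

After a successful build, the current hashes would be recorded in the signatures table; a later run then rebuilds only when a source hash has changed, which is why touching a file's timestamp without changing its contents does not trigger a rebuild.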
4.1.3 Task generators
Declaring the tasks directly is tedious and results in lengthy scripts. Feature-wise, the following is equivalent to the previous example:
def configure(ctx):
    pass

def build(ctx):
    ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='foo.txt')
    ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='bar.txt')
    ctx(rule='cat ${SRC} > ${TGT}', source='foo.txt bar.txt', target='foobar.txt')
The ctx(...) call is a shortcut to the class waflib.TaskGen.task_gen; instances of this class are called task generator objects. The task generators are lazy containers and will only create the tasks and the task classes when they are actually needed:
def configure(ctx):
    pass

def build(ctx):
    tg = ctx(rule='touch ${TGT}', target='foo')
    print(type(tg))  (1)
    print(tg.tasks)  (2)
    tg.post()  (3)
    print(tg.tasks)
    print(type(tg.tasks[0]))  (4)
(1) Task generator type
(2) The tasks created are stored in the list tasks (0..n tasks may be added)
(3) Tasks are created after calling the method post() — it is usually called automatically internally
(4) A new task class was created dynamically for the target foo
4.1.4 Overview of the build phase
A high-level overview of the build process is represented in the following diagram:

(figure: overview of the build process)
Note: the tasks are all created before any of them is executed. New tasks may be created after the build has started, but the dependencies have to be set by using low-level APIs.
4.2 More build options
Although any operation can be executed as part of a task, a few scenarios are typical and it makes sense to provide convenience functions for them.
4.2.1 Executing specific routines before or after the build
User functions may be bound to be executed at two key moments during the build command (callbacks):

1. immediately before the build starts (bld.add_pre_fun)
2. immediately after the build is completed successfully (bld.add_post_fun)

Here is how to execute a test after the build is finished:
top = '.'
out = 'build'

def options(ctx):
    ctx.add_option('--exe', action='store_true', default=False,
        help='execute the program after it is built')

def configure(ctx):
    pass

def pre(ctx):  (1)
    print('before the build is started')

def post(ctx):
    print('after the build is complete')
    if ctx.cmd == 'install':  (2)
        if ctx.options.exe:  (3)
            ctx.exec_command('/sbin/ldconfig')  (4)

def build(ctx):
    ctx.add_pre_fun(pre)  (5)
    ctx.add_post_fun(post)
(1) The callbacks take the build context as unique parameter ctx
(2) Access the command type
(3) Access the command-line options
(4) A common scenario is to call ldconfig after the files are installed
(5) Scheduling the functions for later execution. Python functions are objects too.
$ waf distclean configure build install --exe
'distclean' finished successfully (0.005s)
'configure' finished successfully (0.011s)
Waf: Entering directory `/tmp/build_pre_post/build'
before the build is started  (1)
Waf: Leaving directory `/tmp/build_pre_post/build'
after the build is complete  (2)
'build' finished successfully (0.004s)
Waf: Entering directory `/tmp/build_pre_post/build'
before the build is started
Waf: Leaving directory `/tmp/build_pre_post/build'
after the build is complete
/sbin/ldconfig: Can't create temporary cache file /etc/ld.so.cache~: Permission denied  (3)
'install' finished successfully (15.730s)
(1) Output of the function bound by bld.add_pre_fun
(2) Output of the function bound by bld.add_post_fun
(3) Execution at installation time
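The callback scheduling can be sketched with a toy class. MiniBuild is a hypothetical stand-in, not waflib.Build.BuildContext; it only shows how registered functions bracket the build step:

```python
class MiniBuild(object):
    """Toy model of pre/post build callbacks."""

    def __init__(self):
        self.pre_funs = []
        self.post_funs = []
        self.log = []

    def add_pre_fun(self, f):
        self.pre_funs.append(f)

    def add_post_fun(self, f):
        self.post_funs.append(f)

    def compile(self):
        self.log.append('build')

    def execute(self):
        for f in self.pre_funs:
            f(self)
        self.compile()
        # post functions only run when the build completed successfully
        for f in self.post_funs:
            f(self)

bld = MiniBuild()
bld.add_pre_fun(lambda ctx: ctx.log.append('pre'))
bld.add_post_fun(lambda ctx: ctx.log.append('post'))
bld.execute()
# bld.log == ['pre', 'build', 'post']
```

Since the callbacks receive the context object, they can inspect ctx.cmd or ctx.options exactly as the post function in the example above does.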
4.2.2 Installing files
Three build context methods are provided for installing files created during or after the build:

1. install_files: install several files in a folder
2. install_as: install a target with a different name
3. symlink_as: create a symbolic link on the platforms that support it
def build(bld):
    bld.install_files('${PREFIX}/include', ['a1.h', 'a2.h'])  (1)
    bld.install_as('${PREFIX}/dir/bar.png', 'foo.png')  (2)
    bld.symlink_as('${PREFIX}/lib/libfoo.so.1', 'libfoo.so.1.2.3')  (3)

    env_foo = bld.env.derive()
    env_foo.PREFIX = '/opt'
    bld.install_as('${PREFIX}/dir/test.png', 'foo.png', env=env_foo)  (4)

    start_dir = bld.path.find_dir('src/bar')
    bld.install_files('${PREFIX}/share', ['foo/a1.h'],
        cwd=start_dir, relative_trick=True)  (5)

    bld.install_files('${PREFIX}/share', start_dir.ant_glob('**/*.png'),
        cwd=start_dir, relative_trick=True)  (6)
(1) Install various files in the target destination
(2) Install one file, changing its name
(3) Create a symbolic link
(4) Overriding the configuration set (env is optional in the three methods install_files, install_as and symlink_as)
(5) Install src/bar/foo/a1.h as seen from the current script into ${PREFIX}/share/foo/a1.h
(6) Install the png files recursively, preserving the folder structure read from src/bar/
Note: the methods install_files, install_as and symlink_as only do something during waf install or waf uninstall; they have no effect in other build commands.
4.2.3 Listing the task generators (the list command)
The list command is used to display the task generators that are declared:
top = '.'
out = 'build'

def configure(ctx):
    pass

def build(ctx):
    ctx(source='wscript', target='foo.txt', rule='cp ${SRC} ${TGT}')
    ctx(target='bar.txt', rule='touch ${TGT}', name='bar')
By default, the name of the task generator is computed from the target attribute:
$ waf configure list
'configure' finished successfully (0.005s)
foo.txt
bar
'list' finished successfully (0.008s)
The main usage of the name values is to force a partial build with the --targets option. Compare the following:
$ waf clean build
'clean' finished successfully (0.003s)
Waf: Entering directory `/tmp/build_list/build'
[1/2] foo.txt: wscript -> build/foo.txt
[2/2] bar:  -> build/bar.txt
Waf: Leaving directory `/tmp/build_list/build'
'build' finished successfully (0.028s)

$ waf clean build --targets=foo.txt
'clean' finished successfully (0.003s)
Waf: Entering directory `/tmp/build_list/build'
[1/1] foo.txt: wscript -> build/foo.txt
Waf: Leaving directory `/tmp/build_list/build'
'build' finished successfully (0.022s)
4.2.4 Executing specific tasks (the step command)
The step command is used to execute specific tasks and to return their exit status and any error messages. It is particularly useful for debugging:
$ waf step --files=test_shlib.c,test_staticlib.c
Waf: Entering directory `/tmp/demos/c/build'
c: shlib/test_shlib.c -> build/shlib/test_shlib.c.1.o -> 0
cshlib: build/shlib/test_shlib.c.1.o -> build/shlib/libmy_shared_lib.so -> 0
c: stlib/test_staticlib.c -> build/stlib/test_staticlib.c.1.o -> 0
cstlib: build/stlib/test_staticlib.c.1.o -> build/stlib/libmy_static_lib.a -> 0
Waf: Leaving directory `/tmp/demos/c/build'
'step' finished successfully (0.201s)
In this case the .so files were also rebuilt. Since the files attribute is interpreted as a comma-separated list of regular expressions, the following produces a different output:
$ waf step --files=test_shlib.c$
Waf: Entering directory `/tmp/demos/c/build'
c: shlib/test_shlib.c -> build/shlib/test_shlib.c.1.o -> 0
Waf: Leaving directory `/tmp/demos/c/build'
'step' finished successfully (0.083s)
Finally, the tasks to execute may be prefixed by in: or out: to specify whether the file is a source or a target:
$ waf step --files=out:build/shlib/test_shlib.c.1.o
Waf: Entering directory `/tmp/demos/c/build'
cc: shlib/test_shlib.c -> build/shlib/test_shlib.c.1.o -> 0
Waf: Leaving directory `/tmp/demos/c/build'
'step' finished successfully (0.091s)
Note: when using waf step, all tasks are executed sequentially, even if some of them return a non-zero exit status.
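The matching semantics just described — comma-separated regular expressions, optionally prefixed by in: or out: — can be sketched with the re module. This is an illustration of the selection logic, not Waf's implementation:

```python
import re

def match_files(expr, paths):
    """Select (kind, path) pairs matching a comma-separated list of
    regular expressions; 'in:'/'out:' restrict the side (a sketch)."""
    selected = set()
    for pattern in expr.split(','):
        side = 'any'
        if pattern.startswith('in:'):
            side, pattern = 'in', pattern[3:]
        elif pattern.startswith('out:'):
            side, pattern = 'out', pattern[4:]
        rx = re.compile(pattern)
        for kind, path in paths:
            if side in ('any', kind) and rx.search(path):
                selected.add(path)
    return selected

paths = [('in', 'shlib/test_shlib.c'),
         ('out', 'build/shlib/test_shlib.c.1.o')]
match_files('test_shlib.c$', paths)    # only the source file matches
match_files('out:test_shlib', paths)   # only the object file matches
```

This also explains the transcripts above: without the $ anchor, the pattern test_shlib.c matches the object file build/shlib/test_shlib.c.1.o as well.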
Chapter 5
Node objects
Node objects represent files or folders, and are used to ease operations dealing with the file system. This chapter provides an overview of their usage.
5.1 Design of the node class

5.1.1 The node tree
The Waf nodes inherit from the class waflib.Node.Node and provide a tree structure to represent the file system:

1. parent: the parent node
2. children: the folder contents, or empty if the node is a file

In practice, the reference to the filesystem tree is bound to the context classes for access from Waf commands. Here is an illustration:
top = '.'
out = 'build'

def configure(ctx):
    pass

def dosomething(ctx):
    print(ctx.path.abspath())  (1)
    print(ctx.root.abspath())  (2)
    print("ctx.path contents %r" % ctx.path.children)
    print("ctx.path parent %r" % ctx.path.parent.abspath())
    print("ctx.root parent %r" % ctx.root.parent)
(1) ctx.path represents the path to the wscript file being executed
(2) ctx.root is the root of the file system, or the folder containing the drive letters (win32 systems)
/tmp/node_tree  (1)
/
ctx.path contents {'wscript': /tmp/node_tree/wscript}  (2)
ctx.path parent '/tmp'  (3)
ctx.root parent None  (4)
'dosomething' finished successfully (0.001s)
(1) Absolute paths are used frequently
(2) The folder contents are stored in the dict children, which maps names to node objects
(3) Each node keeps a reference to its parent node
(4) The root node has no parent
Note: there is a strict correspondence between nodes and filesystem elements: a node represents exactly one file or one folder, and only one node may represent a given file or folder.
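The parent/children structure can be sketched with a toy class. This is an illustrative model of the tree described above, not waflib's Node class:

```python
class Node(object):
    """Toy model of a filesystem node tree."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = {}
        if parent is not None:
            # exactly one node per name within a folder
            parent.children[name] = self

    def abspath(self):
        if self.parent is None:
            return '/'
        prefix = self.parent.abspath()
        return prefix + self.name if prefix == '/' else prefix + '/' + self.name

root = Node('')
tmp = Node('tmp', root)
project = Node('node_tree', tmp)
wscript = Node('wscript', project)
# wscript.abspath() == '/tmp/node_tree/wscript'
```

Computing an absolute path by walking up the parent links is what makes the one-node-per-file invariant useful: two nodes are the same file exactly when they are the same object.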
5.1.2 Node caching
The filesystem root appears to only contain one node, although the real filesystem root contains more folders than just /tmp:
$ waf configure dosomething
Setting top to : /tmp/nodes_cache
Setting out to : /tmp/nodes_cache/build
'configure' finished successfully (0.086s)
{'tmp': /tmp}
'dosomething' finished successfully (0.001s)

$ ls /
bin boot dev etc home tmp usr var
This means in particular that some nodes may have to be read from the file system, or created, before being used.
5.2 General usage

5.2.1 Searching and creating nodes
Nodes may be created manually or read from the file system. Three methods are provided for this purpose:
def configure(ctx):
    pass

def dosomething(ctx):
    print(ctx.path.find_node('wscript'))  (1)

    nd1 = ctx.path.make_node('foo.txt')  (2)
    print(nd1)

    nd2 = ctx.path.search('bar.txt')  (3)
    print(nd2)  (4)

    nd1.write('some text')  (5)
    print(ctx.path.listdir())

(1) Search for a node by reading the file system
(2) Search for a node, or create it if it does not exist
(3) Search for a node, but do not try to create it
(4) Searching for a file which does not exist returns None
(5) Write to the file pointed at by the node, creating or overwriting the file
Warning: these methods are not safe for concurrent access. The node class methods are not meant to be thread-safe.
5.2.2 Listing files and folders (ant_glob)
The method ant_glob is used to list files and folders:

1. ant_glob is called on a node object, not on the build context, and it returns only files by default
2. Patterns may contain wildcards such as * or ?, but they are Ant patterns, not regular expressions
3. The symbol ** enables recursion. Complex folder hierarchies may take a lot of time, so use with care
4. Even when recursion is enabled, only files are returned by default. To turn directory listing on, use dir=True
5. Patterns are either lists of strings or space-delimited values. Patterns to exclude are defined in waflib.Node.exclude_regs
The sequence .. represents exactly two dot characters, and not the parent directory. This is used to guarantee that the search will terminate, and that the same files will not be listed multiple times. Consider the following:

    ctx.path.ant_glob('../wscript')  (1)
    ctx.path.parent.ant_glob('wscript')  (2)

(1) Invalid; this pattern will never return anything
(2) Call ant_glob from the parent directory instead
5.2.3 Path manipulation: abspath
The method abspath is used to obtain the absolute path for a node. In the following example, three nodes are used:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(ctx):
    dir = ctx.path  (1)
    src = ctx.path.find_resource('wscript')
    bld = ctx.path.find_or_declare('out.out')

    print(src.abspath(ctx.env))  (2)
    print(bld.abspath(ctx.env))
    print(dir.abspath(ctx.env))  (3)
    print(dir.abspath())
(1) Directory node, source node and build node
(2) Computing the absolute path for a source node or a build node takes a configuration set as parameter
(3) Computing the absolute path for a directory may use a configuration set or not
$ waf distclean configure build
'distclean' finished successfully (0.002s)
'configure' finished successfully (0.005s)
Waf: Entering directory `/tmp/nested/build'
/tmp/nested/wscript  (1)
/tmp/nested/build/out.out  (2)
/tmp/nested/build/  (3)
/tmp/nested  (4)
Waf: Leaving directory `/tmp/nested/build'
'build' finished successfully (0.003s)
(1) Absolute path for the source node
(2) The absolute path for the build node depends on the variant in use
(3) When a configuration set is provided, the absolute path for a directory node is the build directory representation, including the variant
(4) When no configuration set is provided, the directory node absolute path is the one for the source directory
Note: several other methods such as relpath_gen or srcpath are provided. See the API documentation.
5.3 BuildContext-specific methods

5.3.1 Source and build nodes
Although the sources and targets in the wscript files are declared as if they were in the current directory, the target files are output into the build directory. To enable this behaviour, the directory structure below the top directory must be replicated in the out directory. For example, the folder program from demos/c has its equivalent in the build directory:
$ cd demos/c
$ tree
.
|-- build
|   |-- c4che
|   |   |-- build.config.py
|   |   `-- _cache.py
|   |-- config.h
|   |-- config.log
|   `-- program
|       |-- main.c.0.o
|       `-- myprogram
|-- program
|   |-- a.h
|   |-- main.c
|   `-- wscript_build
`-- wscript
To support this, the build context provides two additional nodes:

1. srcnode: node representing the top-level directory
2. bldnode: node representing the build directory

To obtain a build node from a src node and vice versa, the following methods may be used:

1. Node.get_src()
2. Node.get_bld()
5.3.2 Using nodes during the build phase
Although using srcnode and bldnode directly is possible, the three following wrapper methods are much easier to use. They accept a string representing the target as input and return a single node:

1. find_dir: returns a node, or None if the folder cannot be found on the system
2. find_resource: returns a node under the source directory, a node under the corresponding build directory, or None if no such node exists. If the file is not in the build directory, the node signature (a hash of the file contents) is computed and put into a cache
3. find_or_declare: returns a node, or creates the corresponding node in the build directory

Besides, they all use find_dir internally, which will create the required directory structure in the build directory. Because the folders may be replicated in the build directory before the build starts, it is recommended to use these methods whenever possible:
def build(bld):
    p = bld.path.parent.find_dir('src')  (1)
    p = bld.path.find_dir('../src')  (2)

(1) Not recommended; use find_dir instead
(2) Path separators are converted automatically according to the platform
5.3.3 Nodes, tasks and task generators
As seen in the previous chapter, Task objects can process files represented as lists of input and output nodes. The task generators will usually process the input files given as strings to obtain such nodes and bind them to the tasks. Since the build directory can be enabled or disabled, the following file copy is ambiguous, and therefore invalid:

def build(bld):
    bld(rule='cp ${SRC} ${TGT}', source='foo.txt', target='foo.txt')
To actually copy a file into the corresponding build directory with the same name, the ambiguity must be removed:
def build(bld):
    bld(
        rule   = 'cp ${SRC} ${TGT}',
        source = bld.path.make_node('foo.txt'),
        target = bld.path.get_bld().make_node('foo.txt')
    )
Chapter 6
Custom commands
6.1.1 Context inheritance
An instance of the class waflib.Context.Context is used by default for the custom commands. To provide a custom context object, it is necessary to create a context subclass:
def configure(ctx):
    print(type(ctx))

def foo(ctx):  (1)
    print(type(ctx))

def bar(ctx):
    print(type(ctx))

from waflib.Context import Context

class one(Context):
    cmd = 'foo'  (2)

class two(Context):
    cmd = 'tak'  (3)
    fun = 'bar'
(1) A custom command using the default context
(2) Bind a context class to the command foo
(3) Declare a new command named tak, but set it to call the script function bar
'bar' finished successfully (0.001s)
<class 'wscript.two'>
'tak' finished successfully (0.001s)
A typical application of custom contexts is subclassing the build context to use the configuration data loaded in ctx.env:
def configure(ctx):
    ctx.env.FOO = 'some data'

def build(ctx):
    print('build command')

def foo(ctx):
    print(ctx.env.FOO)

from waflib.Build import BuildContext

class one(BuildContext):
    cmd = 'foo'
    fun = 'foo'
Note
The build commands use this system: waf install is bound to waflib.Build.InstallContext, waf step to waflib.Build.StepContext, etc.
6.1.2 Command composition
To re-use existing commands that have incompatible context classes, insert them in the command stack:
def configure(ctx):
    pass

def build(ctx):
    pass

def cleanbuild(ctx):
    from waflib import Options
    Options.commands = ['clean', 'build'] + Options.commands
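The mechanism can be modelled with a plain command stack: running a command may push more command names in front of the remaining ones. This is a sketch of the idea only; the dispatch table and command names here are hypothetical stand-ins for waflib's Options.commands machinery:

```python
# Commands are plain names; a command function may prepend more names
# to the pending list, which is how cleanbuild chains clean and build.
def run_commands(pending, table, executed):
    while pending:
        name = pending.pop(0)
        table[name](pending)
        executed.append(name)

def cleanbuild(pending):
    pending[:0] = ['clean', 'build']   # insert in front, like Options.commands

table = {
    'cleanbuild': cleanbuild,
    'clean': lambda pending: None,
    'build': lambda pending: None,
}
```

Running run_commands(['cleanbuild'], table, log) executes cleanbuild first, then the clean and build commands it queued.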
This technique is useful for writing test cases. By executing waf test, the following script will configure a project, create source files in the source directory, build a program, modify the sources, and rebuild the program. In this case, the program must be rebuilt because a header (an implicit dependency) has changed.
def options(ctx):
    ctx.load('compiler_c')

def configure(ctx):
    ctx.load('compiler_c')

def setup(ctx):
    n = ctx.path.make_node('main.c')
    n.write('#include "foo.h"\nint main() {return 0;}\n')

    global v  # 1
    m = ctx.path.make_node('foo.h')
    m.write('int k = %d;\n' % v)
    v += 1

def build(ctx):
    ctx.program(source='main.c', target='app')

def test(ctx):  # 2
    global v
    v = 12

    from waflib import Options
    lst = ['configure', 'setup', 'build', 'setup', 'build']
    Options.commands = lst + Options.commands

1. To share data between different commands, use a global variable
2. The test command is used to add more commands
6.1.3
When the top-level wscript is read, it is converted into a Python module and kept in memory. Commands may be added dynamically by injecting the desired function into that module. We will now show how to load a Waf tool that adds a command counting the task generators in the project.
top = '.'
out = 'build'

def options(opt):
    opt.load('some_tool', tooldir='.')

def configure(conf):
    pass
Waf tools are loaded once for the configuration and for the build. To ensure that the tool is always enabled, it is mandatory to load its options, even if the tool does not actually provide any. Our tool some_tool.py, located next to the wscript file, will contain the following code:
from waflib import Context

def cnt(ctx):
    """do something"""
    print('just a test')

Context.g_module.__dict__['cnt'] = cnt
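Module injection itself is ordinary Python; a minimal sketch with a stand-in module object (types.ModuleType replaces Context.g_module here):

```python
import types

# Stand-in for the in-memory wscript module (Context.g_module in waf)
g_module = types.ModuleType('wscript')

def cnt(ctx):
    """do something"""
    return 'just a test'

# Injecting the function makes 'cnt' callable as if it had been
# declared in the wscript file itself
g_module.__dict__['cnt'] = cnt
```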
6.2
6.2.1
The WAFLOCK environment variable is used to control the configuration lock and to point at the default build directory. Observe the results on the following project:
def configure(conf):
    pass

def build(bld):
    bld(rule='touch ${TGT}', target='foo.txt')
.
|-- .lock-debug
|-- .lock-release
|-- debug
|   |-- .wafpickle-7
|   |-- c4che
|   |   |-- build.config.py
|   |   `-- _cache.py
|   |-- config.log
|   `-- foo.txt
|-- release
|   |-- .wafpickle-7
|   |-- c4che
|   |   |-- build.config.py
|   |   `-- _cache.py
|   |-- config.log
|   `-- foo.txt
`-- wscript
1. The lock file points at the configuration of the project in use and at the build directory to use
2. The files are output in the build directory debug
3. The configuration release is used with a different lock file and a different build directory
4. The project directory contains the two lock files and the two build folders
The lock le may also be changed from the code by changing the appropriate variable in the waf scripts:
from waflib import Options
Options.lockfile = '.lock-wafname'
Note
The output directory pointed at by the waf lock file only takes effect when it is not given in the waf script
6.2.2
6.2.2.1
In the previous section, two different configurations were used for similar builds. We will now show how two different builds can share the same configuration while outputting their targets into different folders. Let's start with the project file:
def configure(ctx):
    pass

def build(ctx):
    ctx(rule='touch ${TGT}', target=ctx.cmd + '.txt')  # 1

from waflib.Build import BuildContext
class debug(BuildContext):  # 2
    cmd = 'debug'
    variant = 'debug'  # 3

1. The command being called is self.cmd
2. Create the debug command inheriting the build context
3. Declare a folder for the targets of the debug command
This project declares two different builds, build and debug. Let's examine the execution output:
$ waf configure build debug
Setting top to : /tmp/advbuild_variant
Setting out to : /tmp/advbuild_variant/build
configure finished successfully (0.007s)
Waf: Entering directory /tmp/advbuild_variant/build
[1/1] build.txt: -> build/build.txt
Waf: Leaving directory /tmp/advbuild_variant/build
build finished successfully (0.020s)
Waf: Entering directory /tmp/advbuild_variant/build/debug
[1/1] debug.txt: -> build/debug/debug.txt  (1)
Waf: Leaving directory /tmp/advbuild_variant/build/debug
debug finished successfully (0.021s)

$ tree
.
|-- build
|   |-- build.txt  (2)
|   |-- c4che
|   |   |-- build.config.py
|   |   `-- _cache.py
|   |-- config.log
|   `-- debug
|       `-- debug.txt  (3)
`-- wscript

1. Commands are executed from build/variant
2. The default build command does not have any variant
3. The target debug is under the variant directory in the build directory
6.2.2.2 Configuration sets for variants
The variants may require different configuration sets created during the configuration. Here is an example:
def options(opt):
    opt.load('compiler_c')

def configure(conf):
    conf.setenv('debug')  # 1
    conf.load('compiler_c')
    conf.env.CFLAGS = ['-g']  # 2

    conf.setenv('release')
    conf.load('compiler_c')
    conf.env.CFLAGS = ['-O2']

def build(bld):
    if not bld.variant:  # 3
        bld.fatal('call "waf build_debug" or "waf build_release", and try "waf --help"')
    bld.program(source='main.c', target='app', includes='.')  # 4

from waflib.Build import BuildContext, CleanContext, \
    InstallContext, UninstallContext

for x in 'debug release'.split():
    for y in (BuildContext, CleanContext, InstallContext, UninstallContext):
        name = y.__name__.replace('Context', '').lower()
        class tmp(y):  # 5
            cmd = name + '_' + x
            variant = x

1. Create a new configuration set, returned by conf.env and stored in c4che/debug_cache.py
2. Modify some data in the configuration set
3. Make sure a variant is set; this disables the normal commands build, clean and install
4. bld.env will load the configuration set of the appropriate variant (debug_cache.py when in debug)
5. Create new commands such as clean_debug or install_debug (the class name does not matter)
Chapter 7
Task processing
This chapter provides a description of the task classes which are used during the build phase.
7.1 Task execution

7.1.1 Main actors
The build context is only used to create the tasks and to return lists of tasks that may be executed in parallel. The scheduling is delegated to a task producer which lets task consumers execute the tasks. The task producer keeps a record of the build state, such as the number of tasks processed and the errors.
The number of consumers is determined from the number of processors, or may be set manually by using the -j option:
$ waf -j3
7.1.2 Build groups
The task producer iterates over lists of tasks returned by the build context. Although the tasks from a list may be executed in parallel by the consumer threads, all the tasks from one list must be consumed before processing another list of tasks. The build ends when there are no more tasks to process. These lists of tasks are called build groups and may be accessed from the build scripts. Let's demonstrate this behaviour on an example:
def build(ctx):
    for i in range(8):
        ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='wscript_a_%d' % i,
            color='YELLOW', name='tasks a')
        ctx(rule='cp ${SRC} ${TGT}', source='wscript_a_%d' % i, target='wscript_b_%d' % i,
            color='GREEN', name='tasks b')

    for i in range(8):
        ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='wscript_c_%d' % i,
            color='BLUE', name='tasks c')
        ctx(rule='cp ${SRC} ${TGT}', source='wscript_c_%d' % i, target='wscript_d_%d' % i,
            color='PINK', name='tasks d')
Each green task must be executed after one yellow task and each pink task must be executed after one blue task. Because there is only one group by default, the parallel execution will be similar to the following:
We will now modify the example to add one more build group.
def build(ctx):
    for i in range(8):
        ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='wscript_a_%d' % i,
            color='YELLOW', name='tasks a')
        ctx(rule='cp ${SRC} ${TGT}', source='wscript_a_%d' % i, target='wscript_b_%d' % i,
            color='GREEN', name='tasks b')

    ctx.add_group()

    for i in range(8):
        ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='wscript_c_%d' % i,
            color='BLUE', name='tasks c')
        ctx(rule='cp ${SRC} ${TGT}', source='wscript_c_%d' % i, target='wscript_d_%d' % i,
            color='PINK', name='tasks d')
Now a separator will appear between the group of yellow and green tasks and the group of blue and pink tasks:
The tasks and task generators are added implicitly to the current group. By giving a name to the groups, it is easy to control what goes where:
def build(ctx):
    ctx.add_group('group1')
    ctx.add_group('group2')

    for i in range(8):
        ctx.set_group('group1')
        ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='wscript_a_%d' % i,
            color='YELLOW', name='tasks a')
        ctx(rule='cp ${SRC} ${TGT}', source='wscript_a_%d' % i, target='wscript_b_%d' % i,
            color='GREEN', name='tasks b')

        ctx.set_group('group2')
        ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='wscript_c_%d' % i,
            color='BLUE', name='tasks c')
        ctx(rule='cp ${SRC} ${TGT}', source='wscript_c_%d' % i, target='wscript_d_%d' % i,
            color='PINK', name='tasks d')
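The grouping invariant can be sketched in plain Python. This is a toy model, not waflib's BuildContext: tasks go into the current group, and one group is fully consumed before the next one starts:

```python
class Groups:
    """Toy model of build groups: add_group starts a new list, and run
    consumes each list completely before moving to the next one."""
    def __init__(self):
        self.groups = [[]]

    def add_task(self, task):
        self.groups[-1].append(task)

    def add_group(self):
        self.groups.append([])

    def run(self, execute):
        # tasks inside a group may run in parallel; groups may not overlap
        for group in self.groups:
            for task in group:
                execute(task)
```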
In the previous examples, all task generators from all build groups are processed before the build actually starts. This default is provided to ensure that the task count is as accurate as possible. Here is how to tune the build groups:
def build(ctx):
    from waflib.Build import POST_LAZY, POST_BOTH, POST_AT_ONCE
    ctx.post_mode = POST_AT_ONCE  # 1
    #ctx.post_mode = POST_LAZY  # 2
    #ctx.post_mode = POST_BOTH  # 3

1. All task generators create their tasks before the build starts (default behaviour)
2. Groups are processed sequentially: all tasks from previous groups are executed before the task generators from the next group are processed
3. Combination of the two previous behaviours: task generators created by tasks in the next groups may create tasks
Build groups can be used for building a compiler to generate more source les to process.
7.1.3
In most Python interpreters, a global interpreter lock prevents parallelization by more than one CPU core at a time. Therefore, it makes sense to restrict the task scheduling to a single task producer, and to let the threads access only the task execution. The communication between producer and consumers is based on two queues, ready and out. The producer adds the tasks to ready and reads them back from out. The consumers obtain the tasks from ready and give them back to the producer through out after executing task.run. The producer uses an internal list named outstanding to iterate over the tasks and to decide which ones to put in the queue ready. The tasks that cannot be processed yet are temporarily placed in the list frozen to avoid looping endlessly over the tasks waiting for others. The following illustrates the relationship between the task producer and the consumers during the build:
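The ready/out scheme can be sketched with Python's queue module. This is a simplified model: real waf also tracks the outstanding and frozen lists and per-task states:

```python
import threading
import queue

def run_tasks(tasks, nconsumers=3):
    """Producer: push callables into 'ready', collect them from 'out'.
    Consumers: take a task from 'ready', execute it, report to 'out'."""
    ready, out = queue.Queue(), queue.Queue()

    def consumer():
        while True:
            task = ready.get()
            if task is None:        # poison pill: stop this consumer
                return
            task()                  # task.run() in waf
            out.put(task)

    threads = [threading.Thread(target=consumer) for _ in range(nconsumers)]
    for t in threads:
        t.start()
    for task in tasks:              # iterate like the 'outstanding' list
        ready.put(task)
    done = [out.get() for _ in tasks]
    for _ in threads:
        ready.put(None)
    for t in threads:
        t.join()
    return done
```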
7.1.4
A state is assigned to each task (task.hasrun = state) to keep track of the execution. The possible values are the following:

State      Numeric value  Description
NOT_RUN    0              The task has not been processed yet
MISSING    1              The task outputs are missing
CRASHED    2              The task method run returned a non-zero value
EXCEPTION  3              An exception occurred in the Task method run
SKIPPED    8              The task was skipped (it was up to date)
SUCCESS    9              The execution was successful
To decide whether to execute a task or not, the producer uses the value returned by the task method runnable_status. The possible return values are the following:

Code       Description
ASK_LATER  The task may depend on other tasks which have not finished running (not ready)
SKIP_ME    The task does not have to be executed, it is up to date
RUN_ME     The task is ready to be executed
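The interaction between runnable_status and hasrun can be sketched as follows; the constants mirror the two tables above, and the task objects are hypothetical stand-ins:

```python
# State values assigned to task.hasrun
NOT_RUN, MISSING, CRASHED, EXCEPTION, SKIPPED, SUCCESS = 0, 1, 2, 3, 8, 9
# Status values returned by runnable_status
ASK_LATER, SKIP_ME, RUN_ME = 'ASK_LATER', 'SKIP_ME', 'RUN_ME'

def process(task):
    """One producer step: ask the task for its status, update its state."""
    status = task.runnable_status()
    if status == SKIP_ME:
        task.hasrun = SKIPPED            # up to date, nothing to do
    elif status == RUN_ME:
        task.hasrun = SUCCESS if task.run() == 0 else CRASHED
    # ASK_LATER: leave the task for later, its dependencies are not ready
    return status
```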
The following diagram represents the interaction between the main task methods and the states and status:
7.2
7.2.1
The tasks to wait for are stored in the attribute run_after. They are used by the method runnable_status to yield the status ASK_LATER when a task has not run yet. This is merely for the build order and not for forcing a rebuild if one of the previous tasks is executed.
7.2.2 Computed constraints

7.2.2.1 Attribute after/before
The attributes before and after are used to declare ordering constraints between tasks:
from waflib.Task import TaskBase

class task_test_a(TaskBase):
    before = ['task_test_b']

class task_test_b(TaskBase):
    after = ['task_test_a']
7.2.2.2 ext_in/ext_out
Another way to force the order is by declaring lists of abstract symbols on the class attributes. This way the classes are not named explicitly, for example:
from waflib.Task import TaskBase

class task_test_a(TaskBase):
    ext_in = ['.h']

class task_test_b(TaskBase):
    ext_out = ['.h']
The extensions ext_in and ext_out do not mean that the tasks have to produce files with such extensions; they are mere symbols used as precedence constraints.
7.2.2.3 Order extraction
Before feeding the tasks to the producer-consumer system, a constraint extraction is performed on the tasks having input and output files. The attribute run_after is initialized with the tasks to wait for. The two functions called on lists of tasks are:

1. waflib.Task.set_precedence_constraints: extract the build order from the task class attributes ext_in/ext_out/before/after
2. waflib.Task.set_file_constraints: extract the constraints from the tasks having input and output files
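A simplified version of the file-based extraction can be sketched like this, using dict-based stand-ins for tasks rather than waflib's implementation:

```python
def set_file_constraints(tasks):
    """Fill each task's 'run_after' set with the names of the tasks
    producing one of its input files (simplified model)."""
    produced = {}
    for t in tasks:
        for out in t['outputs']:
            produced[out] = t['name']
    for t in tasks:
        t['run_after'] = {produced[i] for i in t['inputs']
                          if i in produced and produced[i] != t['name']}
```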
7.2.3
Tasks that are known to take a lot of time may be launched first to improve the build times. The general problem of finding an optimal order for launching tasks in parallel and with constraints is called job shop scheduling. In practice this problem can often be reduced to a critical path problem (approximation). The following pictures illustrate the difference when scheduling a build with several independent tasks, in which a slow task is clearly identified and launched first:
def build(ctx):
    for x in range(5):
        ctx(rule='sleep 1', color='GREEN', name='short task')
    ctx(rule='sleep 5', color='RED', name='long task')
No particular order

A function is used to reorder the tasks from a group before they are passed to the producer. We will replace it to put the long task in first position:
from waflib import Task
old = Task.set_file_constraints

def meth(lst):
    lst.sort(cmp=lambda x, y: cmp(x.__class__.__name__, y.__class__.__name__))  # 1
    old(lst)  # 2

Task.set_file_constraints = meth  # 3

1. Set the long task in first position
2. Execute the original code
3. Replace the method
7.3 Dependencies

7.3.1 Task signatures
The direct instances of waflib.Task.TaskBase are very limited and cannot be used to track file changes. The subclass waflib.Task.Task provides the necessary features for the most common builds, in which source files are used to produce target files. The dependency tracking is based on hashes of the dependencies, called task signatures. The signature is computed from various dependency sources, such as input files and configuration set values. The following diagram describes how waflib.Task.Task instances are processed:
The following data is used in the signature computation:

1. Explicit dependencies: input nodes and dependencies set explicitly by using bld.depends_on
2. Implicit dependencies: dependencies searched by a scanner method (the method scan)
3. Values: configuration set values, such as compilation flags
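The principle can be sketched with hashlib. This is a simplified model: waf hashes node signatures and serialized env values, not raw bytes as here:

```python
import hashlib

def task_signature(explicit, implicit, values):
    """Hash the three dependency sources listed above into one digest:
    explicit deps, scanner-found implicit deps, configuration values."""
    m = hashlib.md5()
    for data in explicit:
        m.update(data)
    for data in implicit:
        m.update(data)
    for value in values:
        m.update(repr(value).encode())
    return m.hexdigest()
```

Changing any of the three inputs changes the signature, which is what triggers a rebuild.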
7.3.2 Explicit dependencies

7.3.2.1 Input and output nodes
The task objects do not directly depend on other tasks. Other tasks may exist or not, and be executed or not. Rather, the input and output nodes themselves hold signature values, which come from different sources:
1. Nodes for build files usually inherit the signature of the task that generated the file
2. Nodes from elsewhere have a signature computed automatically from the file contents (hash)
7.3.2.2 Global dependencies on other nodes
The tasks may be informed that some files depend on other files transitively, without listing them in the inputs. This is achieved by the method add_manual_dependency from the build context:
def configure(ctx):
    pass

def build(ctx):
    ctx(rule='cp ${SRC} ${TGT}', source='wscript', target='somecopy')
    ctx.add_manual_dependency(
        ctx.path.find_node('wscript'),
        ctx.path.find_node('testfile'))
The file somecopy will be rebuilt whenever wscript or testfile change, even by one character:
$ waf build
Waf: Entering directory /tmp/tasks_manual_deps/build
[1/1] somecopy: wscript -> build/somecopy
Waf: Leaving directory /tmp/tasks_manual_deps/build
build finished successfully (0.034s)

$ waf
Waf: Entering directory /tmp/tasks_manual_deps/build
Waf: Leaving directory /tmp/tasks_manual_deps/build
build finished successfully (0.006s)

$ echo " " >> testfile

$ waf
Waf: Entering directory /tmp/tasks_manual_deps/build
[1/1] somecopy: wscript -> build/somecopy
Waf: Leaving directory /tmp/tasks_manual_deps/build
build finished successfully (0.022s)
7.3.3
Some tasks can be created dynamically after the build has started, so the dependencies cannot always be known in advance. Task subclasses can provide a method named scan to obtain additional nodes implicitly. In the following example, the copy task provides a scanner method to depend on the wscript file found next to the input file.
import time
from waflib.Task import Task

class copy(Task):

    def run(self):
        return self.exec_command('cp %s %s' % (self.inputs[0].abspath(), self.outputs[0].abspath()))

    def scan(self):  # 1
        print('  calling the scanner method')
        node = self.inputs[0].parent.find_resource('wscript')
        return ([node], time.time())  # 2

    def runnable_status(self):
        ret = super(copy, self).runnable_status()  # 3
        bld = self.generator.bld  # 4
        print('nodes:       %r' % bld.node_deps[self.uid()])  # 5
        print('custom data: %r' % bld.raw_deps[self.uid()])  # 6
        return ret

def configure(ctx):
    pass

def build(ctx):
    tsk = copy(env=ctx.env)  # 7
    tsk.set_inputs(ctx.path.find_resource('a.in'))
    tsk.set_outputs(ctx.path.find_or_declare('b.out'))
    ctx.add_to_group(tsk)

1. A scanner method
2. The return value is a tuple containing a list of nodes to depend on and serializable data for custom uses
3. Override the method runnable_status to add some logging
4. Obtain a reference to the build context associated to this task
5. The nodes returned by the scanner method are stored in the map bld.node_deps
6. The custom data returned by the scanner method is stored in the map bld.raw_deps
7. Create a task manually (encapsulation by task generators will be described in the next chapters)
$ waf  (1)
  calling the scanner method
nodes:       [/tmp/tasks_scan/wscript]
custom data: 55.51
[1/1] copy: a.in -> build/b.out
build finished successfully (0.021s)

$ waf  (2)
nodes:       [/tmp/tasks_scan/wscript]
custom data: 1280561555.512006
build finished successfully (0.005s)

$ echo " " >> wscript  (3)

$ waf
  calling the scanner method
nodes:       [/tmp/tasks_scan/wscript]
custom data: 64.31
[1/1] copy: a.in -> build/b.out
build finished successfully (0.022s)
1. The scanner method is always called on a clean build
2. The scanner method is not called when nothing has changed, although the data it returned is retrieved
3. When a dependency changes, the scanner method is executed once again (the custom data has changed)
Warning
If the build order is incorrect, the method scan may fail to find dependent nodes (missing nodes), or the signature calculation may throw an exception (missing signature for dependent nodes).
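A scanner typically parses the input file to discover names to resolve into nodes. Here is a sketch of such a parser for quoted C-style includes; this is a hypothetical helper, not the method used by waf's C preprocessor scanner:

```python
import re

def scan_includes(source_text):
    """Return the file names mentioned in #include "..." directives;
    a scan() method would resolve these into nodes and return them."""
    return re.findall(r'#include\s+"([^"]+)"', source_text)
```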
7.3.4 Values
The habitual use of command-line parameters such as compilation flags leads to the creation of dependencies on values, more specifically the configuration set values. The Task class attribute vars is used to control which values enter the signature calculation. In the following example, the task created has no input or output nodes, and only depends on the values.
from waflib.Task import Task

class foo(Task):  # 1
    vars = ['FLAGS']  # 2

    def run(self):
        print('the flags are %r' % self.env.FLAGS)  # 3

def options(ctx):
    ctx.add_option('--flags', default='-f', dest='flags', type='string')  # 4

def build(ctx):
    ctx.env.FLAGS = ctx.options.flags
    tsk = foo(env=ctx.env)
    ctx.add_to_group(tsk)

def configure(ctx):
    pass

1. Create a task class named foo
2. The task instances will be executed whenever self.env.FLAGS changes
3. Print the value for debugging purposes
4. Read the value from the command line

The task is executed on the first run; as long as the dependencies have not changed, it is skipped; whenever the flags change, it is executed again.
7.4 Task tuning

7.4.1 Class access
def configure(ctx):
    ctx.env.COPY = '/bin/cp'
    ctx.env.COPYFLAGS = ['-f']

def build(ctx):
    from waflib.Task import Task
    class copy(Task):
        run_str = '${COPY} ${COPYFLAGS} ${SRC} ${TGT}'
    print(copy.vars)

    tsk = copy(env=ctx.env)
    tsk.set_inputs(ctx.path.find_resource('wscript'))
    tsk.set_outputs(ctx.path.find_or_declare('b.out'))
    ctx.add_to_group(tsk)
It is assumed that run_str represents a command-line, and that the variables in ${} such as COPYFLAGS represent variables to add to the dependencies. A metaclass processes run_str to obtain the method run (called to execute the task) and the variables in the attribute vars (merged with existing variables). The function created is displayed in the following output:
$ waf --zones=action
13:36:49 action def f(tsk):
        env = tsk.env
        gen = tsk.generator
        bld = gen.bld
        wd = getattr(tsk, 'cwd', None)
        def to_list(xx):
                if isinstance(xx, str): return [xx]
                return xx
        lst = []
        lst.extend(to_list(env['COPY']))
        lst.extend(to_list(env['COPYFLAGS']))
        lst.extend([a.path_from(bld.bldnode) for a in tsk.inputs])
        lst.extend([a.path_from(bld.bldnode) for a in tsk.outputs])
        lst = [x for x in lst if x]
        return tsk.exec_command(lst, cwd=wd, env=env.env or None)

[1/1] copy: wscript -> build/b.out
['COPY', 'COPYFLAGS']
build finished successfully (0.007s)
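The transformation can be approximated in a few lines: extract the ${...} names as the dependency variables, then substitute them to build the argument list. This is a rough sketch of the idea, not waflib's compile_fun:

```python
import re

def compile_run_str(run_str, env, inputs, outputs):
    """Return (vars, argv): the variables that enter the signature,
    and the command arguments after substitution."""
    variables = [v for v in re.findall(r'\$\{(\w+)\}', run_str)
                 if v not in ('SRC', 'TGT')]
    argv = []
    for token in run_str.split():
        name = token[2:-1] if token.startswith('${') and token.endswith('}') else None
        if name == 'SRC':
            argv.extend(inputs)
        elif name == 'TGT':
            argv.extend(outputs)
        elif name is not None:
            value = env.get(name, [])
            argv.extend(value if isinstance(value, list) else [value])
        else:
            argv.append(token)
    return variables, [x for x in argv if x]
```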
All subclasses of waflib.Task.TaskBase are stored in the module attribute waflib.Task.classes. Therefore, the copy task can be accessed by using:
from waflib import Task
cls = Task.classes['copy']
7.4.2 Scriptlet expressions
Although run_str is aimed at configuration set variables, a few special cases are provided for convenience:

1. If the value starts with env, gen, bld or tsk, a method call will be made
2. If the value starts with SRC[n] or TGT[n], a method call to the input/output node n will be made
3. SRC represents the list of task inputs seen from the root of the build directory
4. TGT represents the list of task outputs seen from the root of the build directory

Here are a few examples:
${SRC[0].parent.abspath()}  (1)
${bld.root.abspath()}       (2)
${tsk.uid()}                (3)
${CPPPATH_ST:INCPATHS}      (4)

1. Absolute path of the parent folder of the task's first source file
2. File system root
3. Print the task unique identifier
4. Perform a map replacement equivalent to [env.CPPPATH_ST % x for x in env.INCPATHS]
7.4.3
7.4.3.1
The function waflib.Task.always_run is used to force a task to be executed whenever a build is performed. It sets a method runnable_status that always returns RUN_ME.
def configure(ctx):
    pass

def build(ctx):
    from waflib import Task
    class copy(Task.Task):
        run_str = 'cp ${SRC} ${TGT}'
    copy = Task.always_run(copy)

    tsk = copy(env=ctx.env)
    tsk.set_inputs(ctx.path.find_resource('wscript'))
    tsk.set_outputs(ctx.path.find_or_declare('b.out'))
    ctx.add_to_group(tsk)
For convenience, rule-based task generators can declare the always attribute to achieve the same results:
def build(ctx):
    ctx(
        rule   = 'echo hello',
        always = True
    )
7.4.3.2 File hashes
To detect whether the outputs have really been produced by a task, the task signature is used as the signature of the task's output nodes. As a consequence, files created in the source directory would be rebuilt each time. To avoid this, the node signature should match the actual file contents. This is enforced by using the function waflib.Task.update_outputs on task classes. It replaces the methods post_run and runnable_status of a task class to set the hash of the file contents on the output nodes. For convenience, rule-based task generators can declare the update_outputs attribute to achieve the same results:
def build(ctx):
    ctx(
        rule           = ...,
        source         = ...,
        target         = ...,
        update_outputs = True
    )
Chapter 8
Task generators
8.1 Rule-based task generators (Make-like)
This chapter illustrates the use of rule-based task generators for building simple targets.
8.1.1
Rule-based task generators are a particular category of task generators, producing exactly one task each. The following example shows a task generator producing the file foobar.txt from the project file wscript by executing the command cp to perform a copy:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(  # 1
        rule   = 'cp ${SRC} ${TGT}',  # 2
        source = 'wscript',  # 3
        target = 'foobar.txt',  # 4
    )

1. To instantiate a new task generator, remember that all arguments have the form key=value
2. The attribute rule represents the command to execute in a readable manner (more on this in the next chapters)
3. Source files, either in a space-delimited string or in a list of Python strings
4. Target files, either in a space-delimited string or in a list of Python strings
build finished successfully (0.016s)

$ tree
.
|-- build
|   |-- c4che
|   |   |-- build.config.py
|   |   `-- _cache.py
|   |-- config.log
|   `-- foobar.txt
`-- wscript

$ waf  (3)
Waf: Entering directory /tmp/rules_simple/build
Waf: Leaving directory /tmp/rules_simple/build
build finished successfully (0.006s)

$ echo " " >> wscript  (4)

$ waf
Waf: Entering directory /tmp/rules_simple/build
[1/1] foobar.txt: wscript build/foobar.txt  (5)
Waf: Leaving directory /tmp/rules_simple/build
build finished successfully (0.013s)

1. In the first execution, the target is correctly created
2. Command lines are only displayed in verbose mode, by using the option -v
3. The target is up to date, so the task is not executed
4. Modify the source file in place by appending a space character
5. Since the source has changed, the target is created once again
The string for the rule also enters the dependency calculation. If the rule changes, then the task will be executed once again.
8.1.2 Rule functions
Rules may be given as expression strings or as Python functions. The function is assigned to the task class created:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    def run(task):  # 1
        src = task.inputs[0].abspath()  # 2
        tgt = task.outputs[0].abspath()  # 3
        cmd = 'cp %s %s' % (src, tgt)
        print(cmd)
        return task.exec_command(cmd)  # 4

    bld(
        rule   = run,  # 5
        source = 'wscript',
        target = 'same.txt',
    )

1. Rule functions take the task instance as a parameter
2. Sources and targets are represented internally as Node objects bound to the task instance
3. Commands are executed from the root of the build directory; Node methods such as bldpath ease command-line creation
4. The task class holds a wrapper around subprocess.Popen(...) to execute commands
5. Use a function instead of a string expression
The rule function must return a null value (0, None or False) to indicate success, and must generate the files corresponding to the outputs. The rule function is executed by threads internally, so it is important to write thread-safe code (it cannot search for or create node objects). Unlike string expressions, functions may execute several commands at once.
8.1.3 Shell usage
The attribute shell is used to enable the system shell for command execution. A few points are worth keeping in mind when declaring rule-based task generators:

1. The Waf tools do not use the shell for executing commands
2. The shell is used by default for user commands and custom task generators
3. String expressions containing the symbols >, < or & cannot be transformed into functions to execute commands without a shell, even if told to
4. In general, it is better to avoid the shell whenever possible to avoid quoting problems (for example, paths with blank characters in their names)
5. The shell creates a performance penalty which is more visible on win32 systems

Here is an example:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(rule='cp ${SRC} ${TGT}', source='wscript', target='f1.txt', shell=False)
    bld(rule='cp ${SRC} ${TGT}', source='wscript', target='f2.txt', shell=True)
$ waf distclean configure build --zones=runner,action
distclean finished successfully (0.004s)
configure finished successfully (0.001s)
Waf: Entering directory /tmp/rule/out
23:11:23 action  (1)
def f(task):
        env = task.env
        wd = getattr(task, 'cwd', None)
        def to_list(xx):
                if isinstance(xx, str): return [xx]
                return xx
        lst = []
        lst.extend(['cp'])
        lst.extend([a.srcpath(env) for a in task.inputs])
        lst.extend([a.bldpath(env) for a in task.outputs])
        lst = [x for x in lst if x]
        return task.exec_command(lst, cwd=wd)
23:11:23 action  (2)
def f(task):
        env = task.env
        wd = getattr(task, 'cwd', None)
        p = env.get_flat
        cmd = 'cp %s %s' % (" ".join([a.srcpath(env) for a in task.inputs]),
                " ".join([a.bldpath(env) for a in task.outputs]))
        return task.exec_command(cmd, cwd=wd)
[1/2] f1.txt: wscript -> out/f1.txt
23:11:23 runner system command -> ['cp', '../wscript', 'f1.txt']  (3)
[2/2] f2.txt: wscript -> out/f2.txt
23:11:23 runner system command -> cp ../wscript f2.txt
Waf: Leaving directory /tmp/rule/out
build finished successfully (0.017s)
1. String expressions are converted to functions (here, without the shell)
2. Command execution by the shell; notice the heavy use of string concatenation
3. Commands to execute are displayed by calling waf --zones=runner; when called without the shell, the arguments are displayed as a list
Note For performance and maintainability, try avoiding the shell whenever possible
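The quoting issue from point 4 can be illustrated in a few lines; this is generic subprocess-style reasoning, not waf's exec_command:

```python
import shlex

def as_list_args(prog, paths):
    """Without the shell: each path stays one argument, no quoting needed."""
    return [prog] + list(paths)

def as_shell_cmd(prog, paths):
    """With the shell: every path must be quoted, or a path containing
    a blank would be split into two arguments."""
    return ' '.join([prog] + [shlex.quote(p) for p in paths])
```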
8.1.4
Source and target arguments are optional for make-like task generators, and may point at one or several files at once. Here are a few examples:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(  # 1
        rule   = 'cp ${SRC} ${TGT[0].abspath()} && cp ${SRC} ${TGT[1].abspath()}',
        source = 'wscript',
        target = 'f1.txt f2.txt',
        shell  = True
    )
    bld(  # 2
        source = 'wscript',
        rule   = 'echo ${SRC}'
    )
    bld(  # 3
        target = 'test.k3',
        rule   = 'echo "test" > ${TGT}',
    )
    bld(  # 4
        rule   = 'echo 1337'
    )
    bld(  # 5
        rule   = "echo 'task always run'",
        always = True
    )

1. Generate two files whenever the input or the rule changes. Likewise, a rule-based task generator may have multiple input files.
2. The command is executed whenever the input or the rule changes. There are no declared outputs.
3. No input; the command is executed whenever it changes
4. No input and no output; the command is executed only when the string expression changes
5. No input and no output; the command is executed each time the build is called
8.1.5 Dependencies on file contents
As a second example, we will create a file named r1.txt containing the current date. It will be updated each time the build is executed. A second file named r2.txt will be created from r1.txt.
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(
        name   = 'r1',  # 1
        target = 'r1.txt',
        rule   = '(date > ${TGT}) && cat ${TGT}',  # 2
        always = True,  # 3
    )
    bld(
        name   = 'r2',  # 4
        target = 'r2.txt',
        rule   = 'cp ${SRC} ${TGT}',
        source = 'r1.txt',  # 5
        after  = 'r1',  # 6
    )

1. Give the task generator a name; it will create a task class of the same name to execute the command
2. Create r1.txt with the date
3. There is no source file to depend on and the rule never changes; the task is therefore set to be executed each time the build is started, by using the attribute always
4. If no name is provided, the rule is used as a name for the task class
5. Use r1.txt as a source for r2.txt; since r1.txt was declared before, the dependency will be added automatically (r2.txt will be re-created whenever r1.txt changes)
6. Set the command generating r2.txt to be executed after the command generating r1.txt. The attribute after references task class names, not task generators. Here it works because rule-based task generator tasks inherit the name attribute.
Although r2 depends on r1.txt, r2 was not executed in the second build. As a matter of fact, the signature of the task r1 has not changed, and r1 was only set to be executed each time, regardless of its signature. Since the signature of r1.txt does not change, the signature of r2 will not change either, and r2.txt is considered up to date. We will now illustrate how to make certain that the outputs reflect the file contents and trigger the rebuild for dependent tasks, by enabling the attribute on_results:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(
        name       = 'r1',
        target     = 'r1.txt',
        rule       = '(date > ${TGT}) && cat ${TGT}',
        always     = True,
        on_results = True,
    )
    bld(
        target = 'r2.txt',
        rule   = 'cp ${SRC} ${TGT}',
        source = 'r1.txt',
        after  = 'r1',
    )
1. Start with a clean build; both r1.txt and r2.txt are created
2. Notice the date and time
3. The second build was executed at the same date and time, so r1.txt has not changed and r2.txt is up to date
4. The third build is executed at another date and time; since r1.txt has changed, r2.txt is created once again
8.2
8.2.1
The explicit rules described in the previous section become limiting when processing several files of the same kind. The following code may lead to unmaintainable scripts and to slow builds (for loop):
def build(bld):
    for x in 'a.lua b.lua c.lua'.split():
        y = x.replace('.lua', '.luac')
        bld(source=x, target=y, rule='${LUAC} -s -o ${TGT} ${SRC}')
        bld.install_files('${LUADIR}', x)
Rather, the rule should be removed from the user script, like this:
def build(bld):
    bld(source='a.lua b.lua c.lua')
The equivalent logic may then be provided by using the following code. It may be located in either the same wscript, or in a waf tool:
from waflib import TaskGen
TaskGen.declare_chain(
    name         = 'luac',
    rule         = '${LUAC} -s -o ${TGT} ${SRC}',
    shell        = False,
    ext_in       = '.lua',
    ext_out      = '.luac',
    reentrant    = False,
    install_path = '${LUADIR}',
)
1. The name for the corresponding task class to use
2. The rule is the same as for any rule-based task generator
3. Input file, processed by extension
4. Output file extensions, separated by spaces. In this case there is only one output file
5. The reentrant attribute is used to add the output files as source again, for processing by another implicit rule
6. String representing the installation path for the output files, similar to the destination path from bld.install_files. To disable installation, set it to False.
8.2.2
Now consider the long chain uh.in -> uh.a -> uh.b -> uh.c. The following implicit rules demonstrate how to generate the files while maintaining a minimal user script:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(source='uh.in')

from waflib import TaskGen
TaskGen.declare_chain(name='a', rule='cp ${SRC} ${TGT}', ext_in='.in', ext_out='.a')
TaskGen.declare_chain(name='b', rule='cp ${SRC} ${TGT}', ext_in='.a', ext_out='.b')
TaskGen.declare_chain(name='c', rule='cp ${SRC} ${TGT}', ext_in='.b', ext_out='.c', reentrant=False)
During the build phase, the correct compilation order is computed based on the extensions given:
$ waf distclean configure build
'distclean' finished successfully (0.000s)
'configure' finished successfully (0.090s)
Waf: Entering directory `/comp/waf/demos/simple_scenarios/chaining/build'
[1/3] a: uh.in -> build/uh.a
[2/3] b: build/uh.a -> build/uh.b
[3/3] c: build/uh.b -> build/uh.c
Waf: Leaving directory `/comp/waf/demos/simple_scenarios/chaining/build'
'build' finished successfully (0.034s)
8.2.3
Scanner methods
Because transformation chains rely on implicit transformations, it may be desirable to hide some files from the list of sources. Or, some dependencies may be produced conditionally and may not be known in advance. A scanner method is a kind of callback used to find additional dependencies just before the target is generated. For illustration purposes, let us start with an empty project containing three files: the wscript, ch.in and ch.dep
$ cd /tmp/smallproject
$ tree
.
|-- ch.dep
|-- ch.in
`-- wscript
The build will create a copy of ch.in called ch.out. Also, ch.out must be rebuilt whenever ch.dep changes. This corresponds more or less to the following Makefile:
ch.out: ch.in ch.dep
	cp ch.in ch.out
The code below is independent from the user scripts and may be located in a Waf tool:

def scan_meth(task):
    node = task.inputs[0]
    dep = node.parent.find_resource(node.name.replace('.in', '.dep'))
    if not dep:
        raise ValueError("Could not find the .dep file for %r" % node)
    return ([dep], [])

from waflib import TaskGen
TaskGen.declare_chain(
    name      = 'copy',
    rule      = 'cp ${SRC} ${TGT}',
    ext_in    = '.in',
    ext_out   = '.out',
    reentrant = False,
    scan      = scan_meth,
)
1. The scanner method accepts a task object as input (not a task generator)
2. Use node methods to locate the dependency (and raise an error if it cannot be found)
3. Scanner methods return a tuple containing two lists. The first list contains the node objects to depend on. The second list contains private data such as debugging information. The results are cached between build calls, so the contents must be serializable.
4. Add the scanner method to the chain declaration
$ waf distclean configure build
'distclean' finished successfully (0.001s)
'configure' finished successfully (0.001s)
Waf: Entering directory `/tmp/smallproject/build'
[1/1] copy: ch.in -> build/ch.out
Waf: Leaving directory `/tmp/smallproject/build'
'build' finished successfully (0.010s)

$ waf
Waf: Entering directory `/tmp/smallproject/build'
Waf: Leaving directory `/tmp/smallproject/build'
'build' finished successfully (0.005s)

$ echo 2 > ch.dep
$ waf
Waf: Entering directory `/tmp/smallproject/build'
[1/1] copy: ch.in -> build/ch.out
Waf: Leaving directory `/tmp/smallproject/build'
'build' finished successfully (0.012s)
1. Execute a first clean build. The file ch.out is produced
2. The target ch.out is up-to-date because nothing has changed
3. Change the contents of ch.dep
4. The dependency has changed, so the target is rebuilt
Here are a few important points about scanner methods:
1. They are executed only when the target is not up-to-date.
2. They may not modify the task object or the contents of the configuration set task.env
3. They are executed in a single main thread to avoid concurrency issues
4. The results of the scanner (tuple of two lists) are re-used between build executions (and it is possible to access those results programmatically)
5. The make-like rules also accept a scan argument (scanner methods are bound to the task rather than the task generators)
6. They are used by Waf internally for c/c++ support, to add dependencies dynamically on the header files (.c -> .h)
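As a sketch of point 4, the cached scanner results can be inspected from the build context. The attributes bld.node_deps and bld.raw_deps used below are internal Waf attributes (an assumption based on Waf 1.6+), keyed by the task unique identifier:

```python
# Sketch (assumes Waf 1.6+ internals): inspecting cached scanner results.
from waflib import TaskGen

@TaskGen.feature('*')
def show_scanner_results(self):
    for tsk in self.tasks:
        uid = tsk.uid()
        # first list returned by the scanner (nodes to depend on)
        nodes = self.bld.node_deps.get(uid, [])
        # second list returned by the scanner (private/debugging data)
        raw = self.bld.raw_deps.get(uid, [])
        print('%r depends on %r (raw: %r)' % (tsk, nodes, raw))
```

Since the results are only filled in at build time, this method will show the results cached by a previous build execution.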
8.2.4
Extension callbacks
In the chain declarations from the previous sections, the attribute reentrant was described as controlling whether the generated files are to be processed further or not. There are cases however where one of the two generated files must be declared (because it will be used as a dependency) but where it cannot be considered as a source file in itself (like a header in C/C++). Now consider the two chains (uh.in -> uh.a1 + uh.a2) and (uh.a1 -> uh.b) in the following example:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    obj = bld(source='uh.in')

from waflib import TaskGen
TaskGen.declare_chain(
    name      = 'a',
    action    = 'cp ${SRC} ${TGT}',
    ext_in    = '.in',
    ext_out   = ['.a1', '.a2'],
    reentrant = True,
)
TaskGen.declare_chain(
    name      = 'b',
    action    = 'cp ${SRC} ${TGT}',
    ext_in    = '.a1',
    ext_out   = '.b',
    reentrant = False,
)
$ waf distclean configure build
'distclean' finished successfully (0.001s)
'configure' finished successfully (0.001s)
Waf: Entering directory `/tmp/smallproject'
Waf: Leaving directory `/tmp/smallproject'
Cannot guess how to process bld:///tmp/smallproject/uh.a2 (got mappings ['.a1', '.in'] in class TaskGen.task_gen) -> try conf.load(..)?
The error message indicates that there is no way to process uh.a2: only files with the extension .a1 or .in can be processed. Internally, extension names are bound to callback methods. The error is raised because no such method could be found, and here is how to register an extension callback globally:
from waflib import TaskGen

@TaskGen.extension('.a2')
def foo(*k, **kw):
    pass
To register an extension callback locally, a reference to the task generator object must be kept:
def build(bld):
    obj = bld(source='uh.in')

    def callback(*k, **kw):
        pass

    obj.mappings['.a2'] = callback
The exact method signature and typical usage for the extension callbacks is the following:
from waflib import TaskGen

@TaskGen.extension(".a", ".b")
def my_callback(task_gen_object, node):
    task_gen_object.create_task(
        task_name,
        node,
        output_nodes)
1. Comma-separated list of extensions (strings)
2. Task generator instance holding the data
3. Instance of Node, representing a file (either source or build)
4. The first argument to create a task is the name of the task class
5. The second argument is the input node (or a list of nodes for several inputs)
6. The last parameter is the output node (or a list of nodes for several outputs)
The creation of new task classes will be described in the next section.
8.2.5
Waf tasks are instances of the class Task.TaskBase. Yet, the base class contains the bare minimum, and its immediate subclass Task.Task is usually chosen in user scripts. We will now start over with a simple project containing only one wscript file and an example file named uh.in. A task class will be added.
top = '.'
out = 'build'

def configure(conf):
    pass
def build(bld):
    bld(source='uh.in')

from waflib import Task, TaskGen

@TaskGen.extension('.in')
def process(self, node):
    tsk = self.create_task('abcd')
    print(tsk.__class__)

class abcd(Task.Task):
    def run(self):
        print('executing...')
        return 0
1. Create a new instance of abcd. The method create_task is a shortcut to make certain the task will keep a reference on its task generator.
2. Inherit the class Task located in the module Task.py
3. The method run is called when the task is executed
4. The task return status must be an integer, which is zero to indicate success. Tasks that have failed will be executed again on subsequent builds
Although it is possible to write task classes in plain Python, two functions (factories) are provided to simplify the work, for example:

Task.simple_task_type(
    'xsubpp',
    rule   = '${PERL} ${XSUBPP} ${SRC} > ${TGT}',
    color  = 'BLUE',
    before = 'cc')
def build_it(task):
    return 0

Task.task_type_from_func(
    'sometask',
    func    = build_it,
    vars    = ['SRT'],
    color   = 'RED',
    ext_in  = '.in',
    ext_out = '.out')
1. Task class name
2. Rule to execute during the build
3. Color for the output during the execution
4. Execute the task instance before any instance of task classes named cc. The opposite of before is after
5. Create a new task class from a custom python function. The vars attribute represents additional configuration set values to use as dependencies
6. Task class name
7. Function to use
8. In this context, the extension names are meant to be used for computing the execution order with other tasks, without naming the other task classes explicitly
Note that most attributes are common between the two factories. More usage examples may be found in the existing Waf tools.
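The same declaration can also be sketched as a plain task class; the run_str shortcut below is an assumption based on Waf 1.6+, where it is expanded into a run method automatically:

```python
# Sketch (assumes Waf 1.6+): class-based equivalent of the
# Task.simple_task_type factory call shown above.
from waflib import Task

class xsubpp(Task.Task):
    run_str = '${PERL} ${XSUBPP} ${SRC} > ${TGT}'  # command template, like 'rule'
    color   = 'BLUE'                               # output color during execution
    before  = ['cc']                               # run before tasks of class 'cc'
```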
8.2.6
The first step in processing the source file attribute is to convert all file names into Nodes. Special methods may be mapped to intercept names by the exact file name entry (no extension). The Node objects are then added to the task generator attribute source. The list of nodes is then consumed by regular extension mappings. Extension methods may re-inject the output nodes for further processing by appending them to the attribute source (hence the name reentrant provided in declare_chain).
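As a minimal sketch of this re-injection, an extension callback can append its outputs back to source; the extensions .foo/.bar and the task class foo2bar below are hypothetical names chosen for illustration:

```python
# Sketch: hypothetical extension callback re-injecting its output node
# so that another extension mapping can process it (reentrant behaviour).
from waflib import TaskGen

@TaskGen.extension('.foo')
def process_foo(self, node):
    out = node.change_ext('.bar')           # build node with the new extension
    self.create_task('foo2bar', node, out)  # assumed task class 'foo2bar'
    self.source.append(out)                 # re-inject: the '.bar' mapping will consume it
```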
8.3
So far, various uses of task generators have been demonstrated. This chapter provides a detailed description of the task generator structure and usage.
8.3.1
The chapter on make-like rules illustrated how the attribute rule is processed. Then the chapter on name and extension-based file processing illustrated how the attribute source is processed (in the absence of the rule attribute). To process any attribute, the following properties should hold:
1. Attributes should be processed only when the task generator is set to generate the tasks (lazy processing)
2. There is no list of authorized attributes (task generators may be extended by user scripts)
3. Attribute processing should be controllable on a task generator instance basis (special rules for particular task generators)
4. The extensions should be split into independent files (low coupling between the Waf tools)

Implementing such a system is a difficult problem which led to the creation of very different designs:
1. A hierarchy of task generator subclasses. It was abandoned due to the high coupling between the Waf tools: the C tools required knowledge from the D tool for building hybrid applications
2. Method decoration (creating linked lists of method calls). Replacing or disabling a method safely was no longer possible (addition-only), so this system disappeared quickly
3. Flat method and execution constraint declaration. The concept is close to aspect-oriented programming and might scare programmers.

So far, the third design proved to be the most flexible and was kept. Here is how to define a task generator method:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    v = bld(myattr='Hello, world!')
    v.myattr = 'Hello, world!'
    v.myMethod()

from waflib import TaskGen

@TaskGen.taskgen_method
def myMethod(tgen):
    print(getattr(tgen, 'myattr', None))
1. Attributes may be set by arguments or by accessing the object. It is set two times in this example.
2. Call the task generator method explicitly
3. Use a python decorator
4. Task generator methods have a unique argument representing the current instance
5. Process the attribute myattr when present (the case in the example)
Note The method could be bound by using setattr directly, like for binding any new method on a python class.
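The note above can be sketched as follows; binding with setattr on the task_gen class is assumed to be equivalent to using the decorator:

```python
# Sketch: binding a new method on the task generator class directly with
# setattr, instead of using the @TaskGen.taskgen_method decorator.
from waflib import TaskGen

def myMethod(self):
    print(getattr(self, 'myattr', None))

setattr(TaskGen.task_gen, 'myMethod', myMethod)
```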
8.3.2
So far, the task generator methods defined are only executed through explicit calls. Another decorator is necessary to have a task generator method executed automatically during the build phase. Here is the updated example:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(myattr='Hello, world!')

from waflib import TaskGen

@TaskGen.taskgen_method
@TaskGen.feature('*')
def methodName(self):
    print(getattr(self, 'myattr', None))
1. Bind a method to the task generator class (redundant when other decorators such as TaskGen.feature are used)
2. Bind the method to all features (the wildcard *)
1. The debugging zone task_gen is used to display the task generator methods being executed
2. Display which task generator is being executed
3. The method exec_rule is used to process the rule. It is always executed.
4. The method process_source is used to process the source attribute. It is always executed except if the method exec_rule processes a rule attribute
5. Our task generator method is executed, and prints Hello, world!
6. The task generator methods have been executed; the task generator is marked as done (posted)
8.3.3
So far, the task generator methods we added were declared to be executed by all task generator instances. Limiting the execution to specific task generators requires the use of the feature decorator:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(features='ping')
    bld(features='ping pong')

from waflib import TaskGen

@TaskGen.feature('ping')
def ping(self):
    print('ping')

@TaskGen.feature('pong')
def pong(self):
    print('pong')
Warning
Although the task generator instances are processed in order, the task generator method execution requires a specific declaration for the order of execution. Here, the method pong is executed before the method ping
8.3.4
To control the execution order, two new decorators need to be added. We will now show a new example with two custom task generator methods method1 and method2, executed in that order:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(myattr='Hello, world!')

from waflib import TaskGen

@TaskGen.feature('*')
@TaskGen.before('process_source', 'exec_rule')
def method1(self):
    print('method 1 %r' % getattr(self, 'myattr', None))

@TaskGen.feature('*')
@TaskGen.before('process_source')
@TaskGen.after('method1')
def method2(self):
    print('method 2 %r' % getattr(self, 'myattr', None))
8.3.5
The order constraints on the methods (after/before) are used to sort the list of methods in the attribute meths. The sorting is performed once, and the list is consumed as the methods are executed. Though no new feature may be added once the first method is executed, new methods may still be added dynamically in self.meths. Here is how to create an infinite loop by adding the same method at the end:
from waflib.TaskGen import feature

@feature('*')
def infinite_loop(self):
    self.meths.append('infinite_loop')
8.3.6
We will now illustrate how task generator methods can be used to express abstract dependencies between task generator objects. Here is a new project file located under /tmp/targets/:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(rule='echo A', always=True, name='A')
    bld(rule='echo B', always=True, name='B')
By executing waf --targets=B, only the task generator B will create its tasks, and the output will be the following:
$ waf distclean configure build --targets=B
'distclean' finished successfully (0.000s)
'configure' finished successfully (0.042s)
Waf: Entering directory `/tmp/targets/build'
[1/1] B: B
Waf: Leaving directory `/tmp/targets/build'
'build' finished successfully (0.032s)
Here is a way to ensure that the task generator A has created its tasks when B does:
top = '.'
out = 'build'

def configure(conf):
    pass

def build(bld):
    bld(rule='echo A', always=True, name='A')
    bld(rule='echo B', always=True, name='B', depends_on='A')

from waflib.TaskGen import feature, before_method

@feature('*')
@before_method('process_rule')
def post_the_other(self):
    deps = getattr(self, 'depends_on', [])
    for name in self.to_list(deps):
        other = self.bld.get_tgen_by_name(name)
        print('other task generator tasks (before) %s' % other.tasks)
        other.post()
        print('other task generator tasks (after) %s' % other.tasks)
1. This method will be executed for all task generators, before the attribute rule is processed
2. Try to process the attribute depends_on, if present
3. Obtain the task generator by name, and for the same variant
4. Force the other task generator to create its tasks
In the resulting output, the other task generator has not created any task yet when post_the_other runs; a task generator creates all its tasks by calling its method post(). Although --targets=B was requested, the task from target A was created and executed too.
In practice, the dependencies will often re-use the task objects created by the other task generator: nodes, configuration sets, etc. This is used by the uselib system (see the next chapter on c/c++ builds).
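As a sketch of such re-use, the method from the example above could read the output nodes of the other generator's tasks once they are created; the names follow the previous example and the exact re-use logic is left as an assumption:

```python
# Sketch (assumes the depends_on example above): re-using the output nodes
# created by another task generator after forcing it to post().
from waflib.TaskGen import feature, before_method

@feature('*')
@before_method('process_rule')
def grab_other_outputs(self):
    name = getattr(self, 'depends_on', None)
    if not name:
        return
    other = self.bld.get_tgen_by_name(name)
    other.post()                      # make certain its tasks exist
    for tsk in other.tasks:
        for node in tsk.outputs:      # output nodes may serve as inputs here
            print('could depend on %r' % node)
```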
Chapter 9
C and C++ projects
9.1
9.1.1
The C/C++ builds consist in transforming (compiling) source files into object files, and in assembling (linking) the object files at the end. In theory a single programming language should be sufficient for writing any application, but the situation is usually more complicated:
1. Source files may be created by other compilers in other languages (IDL, ASN1, etc)
2. Additional files may enter in the link step (libraries, object files) and applications may be divided into dynamic or static libraries
3. Different platforms may require different processing rules (manifest files on MS-Windows, etc)

To conceal the implementation details and the portability concerns, each target (program, library) can be wrapped as a single task generator object as in the following example:
def options(opt):
    opt.load('compiler_c')

def configure(conf):
    conf.load('compiler_c')
def build(bld):
    bld.program(source='main.c', target='app', use='myshlib mystlib')
    bld.stlib(source='a.c', target='mystlib')
    bld.shlib(source='b.c', target='myshlib', use='myobjects')
    bld.objects(source='c.c', target='myobjects')
1. Use compiler_c to load the c routines and to find a compiler (for c++ use compiler_cxx, and compiler_d for d)
2. Declare a program built from main.c and using two other libraries
3. Declare a static library
4. Declare a shared library, using the objects from myobjects
The targets will have different extensions and names depending on the platform. For example on Linux, the contents of the build directory will be:
$ tree build
build/
|-- c4che
|   |-- build.config.py
|   `-- _cache.py
|-- a.c.1.o
|-- app
|-- b.c.2.o
|-- c.c.3.o
|-- config.log
|-- libmyshlib.so
|-- libmystlib.a
`-- main.c.0.o
1. Programs have no extension on Linux but will have .exe on Windows
2. The .so extension for shared libraries on Linux will be .dll on Windows
3. The .o object files use the original file name and an index to avoid errors in multiple compilations
The build context methods program, shlib, stlib and objects return a single task generator with the appropriate features detected from the source list. For example, for a program having .c files in the source attribute, the features added will be "c cprogram"; for a d static library, "d dstlib".
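This equivalence can be sketched as a plain task generator declaration with explicit features; the rough equivalence below is an assumption based on the description above:

```python
# Sketch: the shortcut method and the explicit declaration are assumed
# to be roughly equivalent for a C program.
def build(bld):
    bld.program(source='main.c', target='app')
    # ... is roughly the same as:
    bld(features='c cprogram', source='main.c', target='app')
```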
9.1.2
Additional attributes
The methods described previously can process many more attributes than just use. Here is an advanced example:
def options(opt):
    opt.load('compiler_c')

def configure(conf):
    conf.load('compiler_c')

def build(bld):
    bld.program(
        source       = 'main.c',
        target       = 'appname',
        features     = ['more', 'features'],
        includes     = ['.'],
        defines      = ['LINUX=1', 'BIDULE'],
        lib          = ['m'],
        libpath      = ['/usr/lib'],
        stlib        = ['dl'],
        stlibpath    = ['/usr/local/lib'],
        linkflags    = ['-g'],
        rpath        = ['/opt/kde/lib'],
        vnum         = '1.2.3',
        install_path = '${SOME_PATH}/bin',
        cflags       = ['-O2', '-Wall'],
        cxxflags     = ['-O3'],
        dflags       = ['-g'],
    )
1. Source file list
2. Target, converted automatically to target.exe or libtarget.so, depending on the platform and type
3. Additional features to add (for a program consisting of c files, the default will be c cprogram)
4. Includes and defines
5. Shared libraries and shared library link paths
6. Static libraries and static library link paths
7. Use linkflags for specific link flags (not for passing libraries)
8. rpath and vnum, ignored on platforms that do not support them
9. Programs and shared libraries are installed by default. To disable the installation, set None.
10. Miscellaneous flags, applied to the source files that support them (if present)
9.2
Include processing
9.2.1
Execution path and flags
Include paths are used by the C/C++ compilers for finding headers. When one header changes, the files are recompiled automatically. For example on a project having the following structure:
$ tree
.
|-- foo.h
|-- src
|   |-- main.c
|   `-- wscript
`-- wscript
The command-line (output by waf -v) will have the following form:
cc -I. -I.. -Isrc -I../src ../src/main.c -c -o src/main_1.o
Because commands are executed from the build directory, the folders have been converted to include flags in the following way:
..  ->  -I.  -I..
.   ->  -Isrc  -I../src
Here are the important points to remember:
1. The includes are always given relative to the directory containing the wscript file
2. The includes add both the source directory and the corresponding build directory for the task generator variant
3. Commands are executed from the build directory, so the include paths must be converted
4. System include paths should be defined during the configuration and added to INCLUDES variables (uselib)
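Point 4 can be sketched as follows; the variable name INCLUDES_ZLIB and the path are hypothetical names chosen for illustration:

```python
# Sketch: defining a system include path at configuration time
# (hypothetical uselib name 'ZLIB').
def configure(conf):
    conf.env.INCLUDES_ZLIB = ['/usr/local/include']  # absolute system path

def build(bld):
    # 'use' pulls in all *_ZLIB variables, including INCLUDES_ZLIB;
    # the 'includes' paths remain relative to this wscript
    bld.program(source='main.c', target='app', includes='. src', use='ZLIB')
```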
9.2.2
Waf uses a preprocessor written in Python for adding the dependencies on the headers. A simple parser looking at #include statements would miss constructs such as:
#define mymacro "foo.h"
#include mymacro
Using the compiler for finding the dependencies would not work for applications requiring file preprocessing such as Qt. For Qt, special include files having the .moc extension must be detected by the build system and produced ahead of time. The C compiler could not parse such files.
#include "foo.moc"
Since system headers are not tracked by default, the waf preprocessor may miss dependencies written in the following form:
#if SOMEMACRO
/* an include in the project */
#include "foo.h"
#endif
To write portable code and to ease debugging, it is strongly recommended to put all the conditions used within a project into a config.h file.
def configure(conf):
    conf.check(
        fragment    = 'int main() { return 0; }\n',
        define_name = 'FOO',
        mandatory   = True)
    conf.write_config_header('config.h')
For performance reasons, the implicit dependency on the system headers is ignored by default. The following code may be used to enable this behaviour:
from waflib import c_preproc
c_preproc.go_absolute = True
Additional tools such as gccdeps or dumbpreproc provide alternate dependency scanners that can be faster in certain cases (boost).
Note The Waf engine will detect if tasks generate headers necessary for the compilation and compute the build order accordingly. It may sometimes improve the performance of the scanner if the tasks creating headers provide the hint ext_out=[".h"].
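The hint from the note above can be sketched on a rule-based generator producing a header; the generator script name below is hypothetical:

```python
# Sketch: hinting that a task produces headers, so that dependent C tasks
# are scheduled after it (make_header.py is a hypothetical script).
def build(bld):
    bld(
        rule    = 'python ${SRC} > ${TGT}',
        source  = 'make_header.py',
        target  = 'generated.h',
        ext_out = ['.h'],   # hint: this task creates headers
    )
    bld.program(source='main.c', target='app', includes='.')
```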
9.2.3
Dependency debugging
The debugging zones preproc and deps (for example, waf build --zones=preproc,deps) display the preprocessor execution and the dependencies obtained. The dependency computation is performed only when the files are not up-to-date, so these commands will display something only when there is a file to compile.
Note
The scanner is only called when C files or dependencies change. In the rare case of adding headers after a successful compilation, it may be necessary to run waf clean build to force a full scanning.
9.3
9.3.1
The attribute use enables the link against libraries (static or shared), or the inclusion of object files when the task generator referenced is not a library.
def build(bld):
    bld.stlib(
        source = 'test_staticlib.c',
        target = 'mylib',
        name   = 'mylib')

    bld.program(
        source   = 'main.c',
        target   = 'app',
        includes = '.',
        use      = 'mylib')
1. The name attribute must point at exactly one task generator
2. The attribute use contains the task generator names to use
In this example, the file app will be re-created whenever mylib changes (order and dependency). By using task generator names, the program and library declarations may appear in any order and across scripts. For convenience, the name does not have to be defined, and will be pre-set from the target name:
def build(bld):
    bld.stlib(
        source = 'test_staticlib.c',
        target = 'mylib')

    bld.program(
        source   = 'main.c',
        target   = 'app',
        includes = '.',
        use      = 'mylib')
The use processing also exhibits a recursive behaviour. Let's illustrate it with the following example:
def build(bld):
    bld.shlib(
        source = 'a.c',
        target = 'lib1')

    bld.stlib(
        source = 'b.c',
        use    = 'cshlib',
        target = 'lib2')

    bld.shlib(
        source = 'c.c',
        target = 'lib3',
        use    = 'lib1 lib2')

    bld.program(
        source = 'main.c',
        target = 'app',
        use    = 'lib3')
1. A simple shared library
2. The cshlib flags will be propagated to both the library and the program
3. lib3 uses both a shared library and a static library
4. A program using lib3
Because lib2 is a static library that is already linked into lib3, the program app should link against both lib1 and lib3, but not against lib2:
$ waf -v
'clean' finished successfully (0.004s)
Waf: Entering directory `/tmp/cprog_propagation/build'
[1/8] c: a.c -> build/a.c.0.o
12:36:17 runner ['/usr/bin/gcc', '-fPIC', '../a.c', '-c', '-o', 'a.c.0.o']
[2/8] c: b.c -> build/b.c.1.o
12:36:17 runner ['/usr/bin/gcc', '../b.c', '-c', '-o', 'b.c.1.o']
[3/8] c: c.c -> build/c.c.2.o
12:36:17 runner ['/usr/bin/gcc', '-fPIC', '../c.c', '-c', '-o', 'c.c.2.o']
[4/8] c: main.c -> build/main.c.3.o
12:36:17 runner ['/usr/bin/gcc', '../main.c', '-c', '-o', 'main.c.3.o']
[5/8] cstlib: build/b.c.1.o -> build/liblib2.a
12:36:17 runner ['/usr/bin/ar', 'rcs', 'liblib2.a', 'b.c.1.o']
[6/8] cshlib: build/a.c.0.o -> build/liblib1.so
12:36:17 runner ['/usr/bin/gcc', 'a.c.0.o', '-o', 'liblib1.so', '-shared']
[7/8] cshlib: build/c.c.2.o -> build/liblib3.so
12:36:17 runner ['/usr/bin/gcc', 'c.c.2.o', '-o', 'liblib3.so', '-Wl,-Bstatic', '-L.', '-llib2', '-Wl,-Bdynamic', '-L.', '-llib1', '-shared']
[8/8] cprogram: build/main.c.3.o -> build/app
12:36:17 runner ['/usr/bin/gcc', 'main.c.3.o', '-o', 'app', '-Wl,-Bdynamic', '-L.', '-llib1', '-llib3']
Waf: Leaving directory `/tmp/cprog_propagation/build'
'build' finished successfully (0.144s)
To sum up the two most important aspects of the use attribute:
1. The task generators may be created in any order and in different files, but must provide a unique name for the use attribute
2. The use processing will iterate recursively over all the task generators involved, but the flags added depend on the target kind (shared/static libraries)
9.3.2
9.3.2.1
The use keyword may point at special libraries that do not actually declare a target. For example, header-only libraries are commonly used to add specific include paths to several targets:
def build(bld):
    bld(
        includes        = '. src',
        export_includes = 'src',
        name            = 'com_includes')

    bld.stlib(
        source = 'a.c',
        target = 'mylib',
        use    = 'com_includes')

    bld.program(
        source = 'main.c',
        target = 'app',
        use    = 'mylib',
    )
1. The includes attribute is private, but export_includes will be used by other task generators
2. The paths added are relative to the other task generator
3. The export_includes will be propagated to other task generators
9.3.2.2
Object files
def build(bld):
    bld.objects(
        source = 'ctest.c',
        cflags = '-O3',
        target = 'my_targets')

    bld.program(
        source = 'main.c',
        target = 'app',
        use    = 'my_targets')
1. Files will be compiled in c mode, but no program or library will be produced
2. Different compilation flags may be used
3. The objects will be added automatically in the link stage
4. There is no object propagation to other programs or libraries, to avoid duplicate symbol errors
Warning
Like static libraries, object files are often abused to copy-paste binary code. Try to minimize the size of the executables by using shared libraries whenever possible.
9.3.2.3
Fake libraries
Local libraries will trigger a recompilation whenever they change. The methods read_shlib and read_stlib can be used to add this behaviour to external libraries or to binary files present in the project.
def build(bld):
    bld.read_shlib('m', paths=['.', '/usr/lib64'])
    bld.program(source='main.c', target='app', use='m')
The methods will try to find files such as libm.so or libm.dll in the specified paths to compute the required paths and dependencies. In this example, the target app will be re-created whenever /usr/lib64/libm.so changes. These libraries are propagated between task generators just like the shared or static libraries declared locally.
9.3.3
When an element in the attribute use does not match a local library, it is assumed that it represents a system library, and that the required flags are present in the configuration set env. This system enables the addition of several compilation and link flags at once, as in the following example:
import sys

def options(opt):
    opt.load('compiler_c')

def configure(conf):
    conf.load('compiler_c')
    conf.env.INCLUDES_TEST = ['/usr/include']

    if sys.platform != 'win32':
        conf.env.DEFINES_TEST   = ['TEST']
        conf.env.CFLAGS_TEST    = ['-O0']
        conf.env.LIB_TEST       = ['m']
        conf.env.LIBPATH_TEST   = ['/usr/lib']
        conf.env.LINKFLAGS_TEST = ['-g']
        conf.env.INCLUDES_TEST  = ['/opt/gnome/include']

def build(bld):
    mylib = bld.stlib(
        source = 'test_staticlib.c',
        target = 'teststaticlib',
        use    = 'TEST')

    if mylib.env.CC_NAME == 'gcc':
        mylib.cxxflags = ['-O2']
1. For portability reasons, it is recommended to use INCLUDES instead of giving flags of the form -I/include. Note that INCLUDES is used by both c and c++
2. Variables may be left undefined in platform-specific settings, yet the build scripts will remain identical
3. Declare a few variables during the configuration; the variables follow the convention VAR_NAME
4. Add all the VAR_NAME variables corresponding to the use variable NAME, which is TEST in this example
Model to avoid: setting the flags and checking for the configuration should be performed in the configuration section
The variables used for C/C++ are the following:

Table 9.1: Use variables and task generator attributes for C/C++

Uselib variable   Attribute      Usage
LIB               lib            list of shared library names to use, without prefix or extension
LIBPATH           libpath        list of search paths for shared libraries
STLIB             stlib          list of static library names to use, without prefix or extension
STLIBPATH         stlibpath      list of search paths for static libraries
LINKFLAGS         linkflags      list of link flags (use other variables whenever possible)
RPATH             rpath          list of paths to hard-code into the binary during linking time
CFLAGS            cflags         list of compilation flags for c files
CXXFLAGS          cxxflags       list of compilation flags for c++ files
DFLAGS            dflags         list of compilation flags for d files
INCLUDES          includes       include paths
CXXDEPS                          a variable/list to trigger c++ file recompilations when it changes
CCDEPS                           same as above, for c
LINKDEPS                         same as above, for the link tasks
DEFINES           defines        list of defines in the form [key=value, ...]
FRAMEWORK         framework      list of frameworks to use
FRAMEWORKPATH     frameworkpath  list of framework paths to use
ARCH              arch           list of architectures in the form [ppc, x86]
The variables may be left empty for later use, and will not cause errors. During development, the configuration cache files (for example, _cache.py) may be modified from a text editor to try different configurations without forcing a whole project reconfiguration. The files affected will be rebuilt, however.
9.4
Configuration helpers
9.4.1
Configuration tests
The method check is used to detect parameters using a small build project. The main parameters are the following:
1. msg: title of the test to execute
2. okmsg: message to display when the test succeeds
3. errmsg: message to display when the test fails
4. env: environment to use for the build (conf.env is used by default)
5. compile_mode: cc or cxx
6. define_name: add a define for the configuration header when the test succeeds (in most cases it is computed automatically)

The errors raised are instances of waflib.Errors.ConfigurationError. There are no return codes. Besides the main parameters, the attributes from c/c++ task generators may be used. Here is a concrete example:
def configure(conf):
    conf.check(header_name='time.h', features='c cprogram')
    conf.check_cc(function_name='printf', header_name="stdio.h", mandatory=False)
    conf.check_cc(fragment='int main() {2+2==4;}\n', define_name="boobah")
    conf.check_cc(lib='m', cflags='-Wall', defines=['var=foo', 'x=y'], uselib_store='M')
    conf.check_cxx(lib='linux', use='M', cxxflags='-O2')
    conf.check_cc(fragment='''
            #include <stdio.h>
            int main() { printf("4"); return 0; }
        ''',
        define_name = "booeah",
        execute     = True,
        define_ret  = True,
        msg         = "Checking for something")
    conf.check(features='c', fragment='int main(){return 0;}')
    conf.write_config_header('config.h')
1. Try to compile a program using the configuration header time.h; if the header is present on the system, the define HAVE_TIME_H will be added.
2. Try to compile a program with the function printf, adding the header stdio.h (header_name may be a list of additional headers). All configuration tests are mandatory by default (@conf methods) and will raise configuration exceptions. To continue despite a failure, set the attribute mandatory to False.
3. Try to compile a piece of code, and if the test is successful, define the name boobah. Modifications made to the task generator environment are not stored.
4. When the test is successful and when the attribute uselib_store is provided, the names lib, cflags and defines will be converted into the use variables LIB_M, CFLAGS_M and DEFINES_M, and the flag values are added to the configuration environment.
5. Try to compile a simple c program against a library called linux, and reuse the previous parameters for libm by use.
6. Execute a simple program, collect the output, and put it in a define when successful.
7. The tests create a build with a single task generator. By passing the features attribute directly it is possible to disable the compilation or to create more complicated configuration tests.
8. After all the tests are executed, write a configuration header in the build directory (optional). The configuration header is used to limit the size of the command-line.
DEFINES_M = ['var=foo', 'x=y']
CXXFLAGS_M = ['-Wall']
CFLAGS_M = ['-Wall']
LIB_M = ['m']
boobah = 1
booeah = '4'
defines = {'booeah': '"4"', 'boobah': 1, 'HAVE_TIME_H': 1, 'HAVE_PRINTF': 1}
dep_files = ['config.h']
waf_config_files = ['/compilation/waf/demos/adv/build/config.h']
9.4.2 Advanced tests
The method conf.check creates a build context and a task generator internally. This means that attributes such as includes, defines and cxxflags may be used (not all are shown here). Advanced tests may be created by passing feature arguments:
from waflib.TaskGen import feature, before_method

@feature('special_test')
@before_method('process_source')
def my_special_test(self):
    self.bld(rule='touch ${TGT}', target='foo')              # 1
    self.bld(rule='cp ${SRC} ${TGT}', source='foo', target='bar')
    self.source = []                                         # 2

def configure(conf):
    conf.check_cc(features='special_test', msg='my test!')   # 3
1. Create a task generator from another task generator.
2. Disable the compilation of test.c by setting no source files.
3. Use the feature special_test.
9.4.3 Creating configuration headers
Adding lots of command-line define values increases the size of the command-line and conceals the useful information (the differences). Some projects use headers which are generated during the configuration; they are not modified during the build, and they are not installed or redistributed. This system is useful for huge projects, and has been made popular by autoconf-based projects. Writing configuration headers can be performed using the following methods:
def configure(conf):
    conf.define('NOLIBF', 1)
    conf.undefine('NOLIBF')
    conf.define('LIBF', 1)
    conf.define('LIBF_VERSION', '1.0.2')
    conf.write_config_header('config.h')
This code snippet will produce the following config.h in the build directory:
build/
|-- c4che
|   |-- build.config.py
|   `-- _cache.py
|-- config.log
`-- config.h
/* Configuration header created by Waf - do not edit */
#ifndef _CONFIG_H_WAF
#define _CONFIG_H_WAF

/* #undef NOLIBF */
#define LIBF 1
#define LIBF_VERSION "1.0.2"

#endif /* _CONFIG_H_WAF */
Note
By default, the defines are moved from the command-line into the configuration header. This means that the attribute conf.env.DEFINES is cleared by this operation. To prevent this behaviour, use conf.write_config_header(remove=False)
9.4.4 Pkg-config
Instead of duplicating the configuration detection in all dependent projects, configuration files may be written when libraries are installed. To ease the interaction with build systems based on Make (which cannot query databases or APIs), small applications have been created for reading the cache files and interpreting the parameters (with names traditionally ending in -config): pkg-config, wx-config, sdl-config, etc. The method check_cfg is provided to ease the interaction with these applications. Here are a few examples:
def options(opt):
    opt.load('compiler_c')

def configure(conf):
    conf.load('compiler_c')
    conf.check_cfg(atleast_pkgconfig_version='0.0.0')                # 1
    pango_version = conf.check_cfg(modversion='pango')               # 2
    conf.check_cfg(package='pango')                                  # 3
    conf.check_cfg(package='pango', uselib_store='MYPANGO',
        args=['--cflags', '--libs'])                                 # 4
    conf.check_cfg(package='pango',                                  # 5
        args=['pango >= 0.1.0', 'pango < 9.9.9', '--cflags', '--libs'],
        msg="Checking for 'pango' 0.1.0")                            # 6
    conf.check_cfg(path='sdl-config', args='--cflags --libs',
        package='', uselib_store='SDL')                              # 7
    conf.check_cfg(path='mpicc', args='--showme:compile --showme:link',
        package='', uselib_store='OPEN_MPI', mandatory=False)        # 8
1. Check for the pkg-config version.
2. Retrieve the module version for a package as a string. If there were no errors, PANGO_VERSION is defined. It can be overridden with the attribute uselib_store='MYPANGO'.
3. Check if the pango package is present, and define HAVE_PANGO (calculated automatically from the package name).
4. Besides defining HAVE_MYPANGO, extract and store the relevant flags to the use variable MYPANGO (LIB_MYPANGO, LIBPATH_MYPANGO, etc.).
5. Like the previous test, but with pkg-config clauses to enforce a particular version number.
6. Display a custom message on the output. The attributes okmsg and errmsg represent the messages to display in case of success and error respectively.
7. Obtain the flags for sdl-config. The example is applicable for other configuration programs such as wx-config, pcre-config, etc.
8. Suppress the configuration error which is raised whenever the program to execute is not found or returns a non-zero exit status.
Due to the amount of flags, the lack of standards between config applications, and the compiler-dependent flags (-I for gcc, /I for msvc), the pkg-config output is parsed before the corresponding use variables are set. The function parse_flags(line, uselib, env) in the Waf module c_config.py performs the flag extraction. The outputs are written in the build directory into the file config.log:
# project configured on Tue Aug 31 17:30:21 2010 by
# waf 1.6.8 (abi 98, python 20605f0 on linux2)
# using /home/waf/bin/waf configure
#
--Setting top to /disk/comp/waf/docs/book/examples/cprog_pkgconfig
--Setting out to /disk/comp/waf/docs/book/examples/cprog_pkgconfig/build
--Checking for program pkg-config
/usr/bin/pkg-config
find program=['pkg-config'] paths=['/usr/local/bin', '/usr/bin'] var='PKGCONFIG' -> '/usr/bin/pkg-config'
--Checking for pkg-config version >= 0.0.0
['/usr/bin/pkg-config', '--atleast-pkgconfig-version=0.0.0']
yes
['/usr/bin/pkg-config', '--modversion', 'pango']
out: 1.28.0
--Checking for pango
['/usr/bin/pkg-config', 'pango']
yes
--Checking for pango
['/usr/bin/pkg-config', 'pango']
yes
--Checking for pango 0.1.0
['/usr/bin/pkg-config', 'pango >= 0.1.0', 'pango < 9.9.9', '--cflags', '--libs', 'pango']
out: -pthread -I/usr/include/pango-1.0 -I/usr/include/glib-2.0 -I/usr/lib64/glib-2.0/include
-pthread -lpango-1.0 -lgobject-2.0 -lgmodule-2.0 -lgthread-2.0 -lrt -lglib-2.0
yes
--Checking for sdl-config
['sdl-config', '--cflags', '--libs']
out: -I/usr/include/SDL -D_GNU_SOURCE=1 -D_REENTRANT -L/usr/lib64 -lSDL -lpthread
yes
--Checking for mpicc
['mpicc', '--showme:compile', '--showme:link']
out: -pthread
libtool: link: -pthread -L/usr/lib64 -llammpio -llamf77mpi -lmpi -llam -lutil -ldl
After such a configuration, the configuration set contents will be similar to the following:
CFLAGS_OPEN_MPI ['-pthread']
CFLAGS_PANGO ['-pthread']
CXXFLAGS_OPEN_MPI ['-pthread']
CXXFLAGS_PANGO ['-pthread']
DEFINES ['HAVE_PANGO=1', 'HAVE_MYPANGO=1', 'HAVE_SDL=1', 'HAVE_OPEN_MPI=1']
DEFINES_SDL ['_GNU_SOURCE=1', '_REENTRANT']
INCLUDES_PANGO ['/usr/include/pango-1.0', '/usr/include/glib-2.0', '/usr/lib64/glib-2.0/include']
INCLUDES_SDL ['/usr/include/SDL']
LIBPATH_OPEN_MPI ['/usr/lib64']
LIBPATH_SDL ['/usr/lib64']
LIB_OPEN_MPI ['lammpio', 'lamf77mpi', 'mpi', 'lam', 'util', 'dl']
LIB_PANGO ['pango-1.0', 'gobject-2.0', 'gmodule-2.0', 'gthread-2.0', 'rt', 'glib-2.0']
LIB_SDL ['SDL', 'pthread']
LINKFLAGS_OPEN_MPI ['-pthread']
LINKFLAGS_PANGO ['-pthread']
PKGCONFIG '/usr/bin/pkg-config'
PREFIX '/usr/local'
define_key ['HAVE_PANGO', 'HAVE_MYPANGO', 'HAVE_SDL', 'HAVE_OPEN_MPI']
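The flag extraction performed by parse_flags can be pictured with a small standalone sketch. This is an illustration of the idea only, not the actual implementation from c_config.py, and the flag-to-variable mapping shown is deliberately simplified:

```python
import shlex

def parse_flags(line, uselib, env):
    """Distribute compiler/linker flags into use variables
    (simplified sketch of what c_config.py does)."""
    for x in shlex.split(line):
        if x.startswith('-I'):
            env.setdefault('INCLUDES_' + uselib, []).append(x[2:])
        elif x.startswith('-L'):
            env.setdefault('LIBPATH_' + uselib, []).append(x[2:])
        elif x.startswith('-l'):
            env.setdefault('LIB_' + uselib, []).append(x[2:])
        elif x.startswith('-D'):
            env.setdefault('DEFINES_' + uselib, []).append(x[2:])
        else:
            # anything else is treated as a plain compilation flag
            env.setdefault('CFLAGS_' + uselib, []).append(x)

env = {}
parse_flags('-I/usr/include/SDL -D_GNU_SOURCE=1 -L/usr/lib64 -lSDL -lpthread', 'SDL', env)
print(env['LIB_SDL'])        # ['SDL', 'pthread']
print(env['INCLUDES_SDL'])   # ['/usr/include/SDL']
```

Running it on the sdl-config output shown in the log above distributes the flags into LIB_SDL, INCLUDES_SDL, LIBPATH_SDL and DEFINES_SDL.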
Chapter 10
Advanced scenarios
This chapter presents a few examples of using the Waf library in more complicated and less common scenarios.
10.1 Project organization

10.1.1 Building the compiler first
The example below demonstrates how to build a compiler which is used for building the remaining targets. The requirements are the following:

1. Create the compiler and all its intermediate tasks.
2. Re-use the compiler in a second build step.
3. The compiler will transform .src files into .cpp files, which will be processed too.
4. Call the compiler again if it was rebuilt (add the dependency on the compiler).

The first thing to do is to write the expected user script:
top = '.'
out = 'build'

def configure(ctx):
    ctx.load('g++')
    ctx.load('src2cpp', tooldir='.')

def build(ctx):
    ctx.program(       # 1
        source = 'comp.cpp',
        target = 'comp')
    ctx.add_group()    # 2
1. Build the compiler first; it will result in a binary named comp.
2. Add a new build group to make certain the compiler is complete before processing the next tasks.
The code for the src -> cpp conversion will be the following:
from waflib.Task import Task

class src2cpp(Task):   # 1
    run_str = '${SRC[0].abspath()} ${SRC[1].abspath()} ${TGT}'
    color   = 'PINK'

from waflib.TaskGen import extension

@extension('.src')
def process_src(self, node):   # 2
    tg = self.bld.get_tgen_by_name('comp')   # 3
    comp = tg.link_task.outputs[0]
    tsk = self.create_task('src2cpp', [comp, node], node.change_ext('.cpp'))   # 4
    self.source.extend(tsk.outputs)   # 5
1. Declare a new task class for processing the source file by our compiler.
2. Files of extension .src are to be processed by this method.
3. Obtain a reference on the task generator producing the compiler.
4. Create the task src2cpp, the compiler being used as the first source file.
5. Add the generated cpp file to be processed too.
1. Creation of the comp program.
2. Use the compiler to generate a.cpp.
3. Compile and link a.cpp and main.cpp into the program foo.
Note When waf --targets=foo is called, the task generator comp will create its tasks too (task generators from previous groups are processed).
10.1.2
A file is copied into the build directory before the build starts. The build may use this file for building other targets.
cfg_file = 'somedir/foo.txt'

def configure(conf):
    orig = conf.root.find_node('/etc/fstab')
    txt = orig.read()   # 1

    dest = conf.bldnode.make_node(cfg_file)
    dest.parent.mkdir()   # 2
    dest.write(txt)   # 3

    conf.env.append_value('cfg_files', dest.abspath())   # 4
1. Read the file /etc/fstab.
2. Create the destination directory in case it does not already exist.
3. Create a new file in the build directory.
4. Mark the output as a configuration file so it can be used during the build.
10.2
10.2.1
Let's now illustrate the @extension decorator on idl file processing. Files with the .idl extension are processed to produce .c and .h files (foo.idl -> foo.c + foo.h). The .c files must be compiled after being generated. First, here is the declaration expected in user scripts:
top = '.'
out = 'build'

def configure(conf):
    conf.load('g++')

def build(bld):
    bld.program(
        source = 'foo.idl main.cpp',
        target = 'myapp')
The file foo.idl is listed as a source; it will be processed to foo.cpp and compiled and linked with main.cpp. Here is the code to support this scenario:
from waflib.Task import Task
from waflib.TaskGen import extension

class idl(Task):
    run_str = 'cp ${SRC} ${TGT[0].abspath()} && touch ${TGT[1].abspath()}'   # 1
    color   = 'BLUE'
    ext_out = ['.h']   # 2

@extension('.idl')
def process_idl(self, node):
    cpp_node = node.change_ext('.cpp')
    hpp_node = node.change_ext('.hpp')
    self.create_task('idl', node, [cpp_node, hpp_node])   # 3
    self.source.append(cpp_node)   # 4
1. Dummy command for demonstration purposes. In practice, the rule to use would be something like omniidl -bcxx ${SRC} -C${TGT}.
2. Because the idl task produces headers, it must be executed before any other cpp file is compiled.
3. Create the task from the .idl extension.
4. Reinject the file to compile by the C++ compiler.
Note
The drawback of this declaration is that the source files produced by the idl transformation can be used by only one task generator.
10.2.2
Let's suppose now that the idl outputs will be shared by several task generators. We will start by writing the expected user script:
top = '.'
out = 'out'

def configure(ctx):
    ctx.load('g++')

def build(ctx):
    ctx(   # 1
        source = 'notify.idl',
        name   = 'idl_gen')

    ctx.program(   # 2
        source   = ['main.cpp'],
        target   = 'testprog',
        includes = '.',
        add_idl  = 'idl_gen')   # 3
1. Process an idl file in a first task generator. Name this task generator idl_gen.
2. Somewhere else (maybe in another script), another task generator will use the source generated by the idl processing.
3. Reference the idl processing task generator by the name idl_gen.
1. The idl processing must be performed before any C++ task is executed.
2. Bind the output file to a new attribute.
3. Add the source from another task generator object.
4. Process add_idl, finding the other task generator.
5. Ensure that the other task generator has created its tasks.
6. Update the source list.
The task execution output will be very similar to the output from the first example:
$ waf distclean configure build -v
'distclean' finished successfully (0.007s)
Setting top to   : /tmp/scenarios_idl2
Setting out to   : /tmp/scenarios_idl2/build
Checking for program g++,c++   : /usr/bin/g++
Checking for program ar        : /usr/bin/ar
'configure' finished successfully (0.080s)
Waf: Entering directory `/tmp/scenarios_idl2/build'
[1/4] idl: foo.idl -> build/foo.cpp build/foo.hpp
20:20:24 runner 'cp ../foo.idl foo.cpp && touch foo.hpp'
[2/4] cxx: main.cpp -> build/main.cpp.1.o
20:20:24 runner ['/usr/bin/g++', '-I.', '-I..', '../main.cpp', '-c', '-o', 'main.cpp.1.o']
[3/4] cxx: build/foo.cpp -> build/foo.cpp.1.o
20:20:24 runner ['/usr/bin/g++', '-I.', '-I..', 'foo.cpp', '-c', '-o', 'foo.cpp.1.o']
[4/4] cxxprogram: build/main.cpp.1.o build/foo.cpp.1.o -> build/testprog
20:20:24 runner ['/usr/bin/g++', 'main.cpp.1.o', 'foo.cpp.1.o', '-o', 'testprog']
Waf: Leaving directory `/tmp/scenarios_idl2/build'
'build' finished successfully (0.130s)
10.3
10.3.1
In general, variables in task generator attributes are not expanded, so the following is not going to compile main.c:
bld.env.FOO = '/usr/includes'
bld.env.MAIN = 'main.c'
bld(
    features = 'c cprogram',
    source   = '${MAIN}',
    target   = 'app',
    includes = '. ${FOO}')
This design decision is motivated by two main reasons:

1. Processing the attributes has a negative performance impact.
2. For consistency, all attributes would have to be processed.

Nevertheless, we will demonstrate how to provide Waf with a method to process some attributes. To add a new task generator method, it is necessary to think about its integration with the other methods: is there a particular order? The answer is yes: for example, the source attribute is used to create the compilation tasks. To display what methods are in use, execute Waf with the following logging key:
$ waf --zones=task_gen
...
19:20:51 task_gen posting task_gen 'app' declared in 'scenarios_expansion'
19:20:51 task_gen -> process_rule (9232720)
19:20:51 task_gen -> process_source (9232720)
19:20:51 task_gen -> apply_link (9232720)
19:20:51 task_gen -> apply_objdeps (9232720)
19:20:51 task_gen -> process_use (9232720)
19:20:51 task_gen -> propagate_uselib_vars (9232720)
19:20:51 task_gen -> apply_incpaths (9232720)
19:20:51 task_gen posted app
From the method list, we find that process_rule and process_source are processing the source attribute. The includes attribute is processed by apply_incpaths.
from waflib import Utils, TaskGen

@TaskGen.feature('*')   # 1
@TaskGen.before('process_source', 'process_rule', 'apply_incpaths')   # 2
def transform_strings(self):
    for x in 'includes source'.split():   # 3
        val = getattr(self, x, None)
        if val:
            if isinstance(val, str):
                setattr(self, x, Utils.subst_vars(val, self.env))   # 4
            elif isinstance(val, list):
                for i in xrange(len(val)):
                    if isinstance(val[i], str):
                        val[i] = Utils.subst_vars(val[i], self.env)
1. Execute this method in all task generators.
2. Methods to take into account.
3. Iterate over all interesting attributes.
4. Substitute the attributes.
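To picture what Utils.subst_vars does with the attribute values, here is a standalone sketch of a ${VAR} substitution helper. This is an illustration only; the real implementation in waflib/Utils.py handles more cases:

```python
import re

def subst_vars(expr, params):
    """Replace ${VAR} occurrences with values from a mapping, joining
    list values with spaces (simplified stand-in for Utils.subst_vars)."""
    def repl(match):
        val = params.get(match.group(1), '')
        if isinstance(val, (list, tuple)):
            return ' '.join(str(x) for x in val)
        return str(val)
    return re.sub(r'\$\{(\w+)\}', repl, expr)

env = {'FOO': '/usr/includes', 'MAIN': 'main.c'}
print(subst_vars('${MAIN}', env))    # main.c
print(subst_vars('. ${FOO}', env))   # . /usr/includes
```

With such a helper, the attribute values from the failing example above ('${MAIN}' and '. ${FOO}') expand to 'main.c' and '. /usr/includes' before the tasks are created.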
10.3.2
A scenario that appears from time to time in C/C++ projects is the need to insert specific flags before others, regardless of how flags are usually processed. We will now consider the following case: execute all C++ compilations with the flag -I. in first position (before any other include). First, a look at the definition of the C++ compilation rule shows that the variable INCPATHS contains the include flags:
class cxx(Task.Task):
    color   = 'GREEN'
    run_str = '${CXX} ${CXXFLAGS} ${CPPPATH_ST:INCPATHS} ${CXX_SRC_F}${SRC} ${CXX_TGT_F}${TGT}'
    vars    = ['CXXDEPS']
    ext_in  = ['.h']
    scan    = c_preproc.scan
Those include flags are set by the method apply_incpaths. The trick is then to modify INCPATHS after that method has been executed:
top = '.'
out = 'build'

def configure(conf):
    conf.load('g++')

def build(bld):
    bld.program(features='cxx cxxprogram', source='main.cpp', target='test')

from waflib.TaskGen import after_method, feature

@feature('cxx')
@after_method('apply_incpaths')
def insert_blddir(self):
    self.env.prepend_value('INCPATHS', '.')
A related case is how to add the top-level directory containing a configuration header:
@feature('cxx')
@after_method('apply_incpaths', 'insert_blddir')
def insert_srcdir(self):
    path = self.bld.srcnode.abspath()
    self.env.prepend_value('INCPATHS', path)
10.4 Custom tasks

10.4.1 Force the compilation of a particular task
In some applications, it may be interesting to keep track of the date and time of the last build. In C this may be done by using the macros __DATE__ and __TIME__; for example, the following about.c file uses them:
#include <stdio.h>

void ping() {
    printf("Project compiled: %s %s\n", __DATE__, __TIME__);
}
The files are only compiled when they change, though, so it is necessary to find a way to force the about.c recompilation. To sum up, the compilation should be performed whenever:

1. One of the c files of the project is compiled.
2. The link flags for any task change.
3. The link task including the object for our macro is removed.

To illustrate this behaviour, we will now set up a project that uses various c files:
def options(opt):
    opt.load('compiler_c')

def configure(conf):
    conf.load('compiler_c')

def build(bld):
    bld.program(
        source   = 'main.c about.c',
        target   = 'app',
        includes = '.',
        use      = 'my_static_lib')

    bld.stlib(
        source = 'test_staticlib.c',
        target = 'my_static_lib')
The main file will just call the function ping defined in about.c to display the date and time:
#include "a.h"

int main() {
    ping();
    return 0;
}
The task method runnable_status must be overridden to take into account the dependencies:
import os
from waflib import Task
from waflib.Task import TaskBase

def runnable_status(self):
    if self.inputs[0].name == 'about.c':   # 1
        h = 0   # 2
        for g in self.generator.bld.groups:
            for tg in g:
                if isinstance(tg, TaskBase):
                    continue   # 3
                h = hash((self.generator.bld.hash_env_vars(self.generator.env, ['LINKFLAGS']), h))
                for tsk in getattr(tg, 'compiled_tasks', []):   # all .c or .cpp compilations
                    if id(tsk) == id(self):
                        continue
                    if not tsk.hasrun:
                        return Task.ASK_LATER
                    h = hash((tsk.signature(), h))   # 4
        self.env.CCDEPS = h
        try:
            os.stat(self.generator.link_task.outputs[0].abspath())   # 5
        except:
            return Task.RUN_ME
    return Task.Task.runnable_status(self)   # 6

from waflib.Tools.c import c   # 7
c.runnable_status = runnable_status
1. If the task processes about.c...
2. ...define a hash value that the task will depend on (CCDEPS).
3. Iterate over all task generators of the project.
4. Hash the link flags and the signatures of all other compilation tasks.
5. Make sure to execute the task if it was never executed before.
6. Normal behaviour.
7. Modify the c task class.
$ waf
Waf: Entering directory `/tmp/scenarios_end/build'
[2/5] c: test_staticlib.c -> build/test_staticlib.c.1.o
[3/5] cstlib: build/test_staticlib.c.1.o -> build/libmy_static_lib.a
[4/5] c: about.c -> build/about.c.0.o
[5/5] cprogram: build/main.c.0.o build/about.c.0.o -> build/app
Waf: Leaving directory `/tmp/scenarios_end/build'
'build' finished successfully (0.088s)

$ ./build/app
Project compiled: Jul 25 2010 14:05:30

$ echo " " >> main.c
$ waf
Waf: Entering directory `/tmp/scenarios_end/build'
[1/5] c: main.c -> build/main.c.0.o
[4/5] c: about.c -> build/about.c.0.o
[5/5] cprogram: build/main.c.0.o build/about.c.0.o -> build/app
Waf: Leaving directory `/tmp/scenarios_end/build'
'build' finished successfully (0.101s)

$ ./build/app
Project compiled: Jul 25 2010 14:05:49
1. All files are compiled on the first build.
2. The file main.c is modified.
3. The build regenerates about.c again to update the build time string.
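The signature-folding trick used in runnable_status above can be demonstrated without waf. A sketch assuming signatures are plain byte strings; any change in the inputs changes the folded CCDEPS-like value, which in turn forces a recompilation:

```python
def fold_signatures(signatures, linkflags):
    """Fold many task signatures plus the link flags into one value;
    if any input changes, the folded value changes (sketch of the
    CCDEPS idea, not waf code)."""
    h = hash(tuple(linkflags))
    for sig in signatures:
        h = hash((sig, h))
    return h

a = fold_signatures([b'sig-main', b'sig-lib'], ['-Wall'])
b = fold_signatures([b'sig-main', b'sig-lib'], ['-Wall'])
c = fold_signatures([b'sig-main-CHANGED', b'sig-lib'], ['-Wall'])
assert a == b   # unchanged inputs: about.c does not need recompiling
assert a != c   # a recompiled unit changes the dependency value
```

Storing the folded value in a variable the task depends on (CCDEPS above) is what makes the scheduler rerun the about.c compilation.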
10.4.2
The requirements for this problem are the following:

1. A compiler creates source files (one .src file -> several .c files).
2. The source file names to create are known only when the compiler is executed.
3. The compiler is slow, so it should run only when absolutely necessary.
4. Other tasks will depend on the generated files (compile and link the .c files into a program).

To do this, the information on the source files must be shared between the build executions.
top = '.'
out = 'build'

def configure(conf):
    conf.load('gcc')
    conf.load('mytool', tooldir='.')

def build(bld):
    bld.env.COMP = bld.path.find_resource('evil_comp.py').abspath()   # 1
    bld.stlib(source='x.c foo.src', target='astaticlib')   # 2
        out = Utils.to_list(out)
        self.outputs = [self.generator.path.find_or_declare(x) for x in out]
        self.generator.bld.raw_deps[self.uid()] = [self.signature()] + self.outputs   # 5
        self.add_c_tasks(self.outputs)   # 6

    def add_c_tasks(self, lst):
        self.more_tasks = []
        for node in lst:
            if node.name.endswith('.h'):
                continue
            tsk = self.generator.create_compiled_task('c', node)
            self.more_tasks.append(tsk)   # 7
            tsk.env.append_value('INCPATHS', [node.parent.abspath()])

            if getattr(self.generator, 'link_task', None):   # 8
                self.generator.link_task.set_run_after(tsk)
                self.generator.link_task.inputs.append(tsk.outputs[0])

    def runnable_status(self):
        ret = super(src2c, self).runnable_status()
        if ret == Task.SKIP_ME:
            lst = self.generator.bld.raw_deps[self.uid()]
            if lst[0] != self.signature():
                return Task.RUN_ME
            nodes = lst[1:]
            for x in nodes:
                try:
                    os.stat(x.abspath())
                except:
                    return Task.RUN_ME
            nodes = lst[1:]
            self.set_outputs(nodes)
            self.add_c_tasks(nodes)   # 9
        return ret
1. The processing will be delegated to the task.
2. Disable the warnings raised when a task has no outputs.
3. Make certain the processing will be executed before any task using .h files.
4. When the task is executed, collect the process stdout, which contains the generated file names.
5. Store the output file nodes in a persistent cache.
6. Create the tasks to compile the outputs.
7. The c tasks will be processed after the current task is done. This does not mean that the c tasks will always be executed.
8. If the task generator of the src file has a link task, set the build order.
9. When this task can be skipped, force the dynamic c task creation.
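The persistence trick (storing the signature and the output list in raw_deps so the slow compiler can be skipped on later runs) can be pictured with a standalone sketch. The cache file name and helper below are illustrative, not waf API:

```python
import os
import pickle
import tempfile

# hypothetical cache location for this sketch
CACHE = os.path.join(tempfile.gettempdir(), 'raw_deps_demo.pickle')

def run_compiler(signature, produce):
    """Rerun the slow 'produce' step only when the signature changed;
    otherwise reuse the output names stored by a previous execution
    (a sketch of the bld.raw_deps idea, not actual waf code)."""
    try:
        with open(CACHE, 'rb') as f:
            stored = pickle.load(f)
        if stored[0] == signature:
            return stored[1:]            # outputs known, compiler skipped
    except (OSError, EOFError):
        pass
    outputs = produce()                  # names known only at execution time
    with open(CACHE, 'wb') as f:
        pickle.dump([signature] + outputs, f)
    return outputs
```

As in the runnable_status override above, a changed signature (or a missing output file) is what forces the expensive step to run again.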
Chapter 11
11.1 Execution traces

11.1.1 Logging
The generic flag used to add more information to the stack traces or messages is -v (verbosity); it displays the command-lines executed during a build:
$ waf -v
To display all the traces (useful for bug reports), use the following flag:
$ waf -vvv
The Waf module Logs replaces the Python module logging. In the source code, traces are provided by using the debug function; they must obey the format "zone: message", as in the following:
Logs.debug('task: executing %r - it was never run before or its class changed' % self)
The following zones are used in Waf:

Table 11.1: Debugging zones

Zone       Description
runner     command-lines executed (enabled when -v is provided without debugging zones)
deps       implicit dependencies found (task scanners)
task_gen   task creation (from task generators) and task generator method execution
action     functions to execute for building the targets
env        environment contents
envhash    hashes of the environment objects - helps seeing what changes
build      build context operations such as filesystem access
preproc    preprocessor execution
group      groups and task generators
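The "zone: message" convention can be reproduced with the standard logging module. Here is a sketch of a zone filter similar in spirit to what --zones enables; it is an illustration, not the actual Logs implementation:

```python
import logging

class ZoneFilter(logging.Filter):
    """Pass only records whose message starts with an enabled
    'zone:' prefix (sketch of the --zones behaviour)."""
    def __init__(self, zones):
        logging.Filter.__init__(self)
        self.zones = zones
    def filter(self, record):
        return record.getMessage().split(':', 1)[0] in self.zones

captured = []
class ListHandler(logging.Handler):
    def emit(self, record):
        captured.append(record.getMessage())

logger = logging.getLogger('waf_zone_demo')
logger.setLevel(logging.DEBUG)
logger.propagate = False
logger.addHandler(ListHandler())
logger.addFilter(ZoneFilter(['runner']))

logger.debug('runner: gcc -c main.c')    # kept: the zone is enabled
logger.debug('task_gen: posting app')    # dropped: the zone is not enabled
print(captured)  # ['runner: gcc -c main.c']
```

The filter extracts the zone from the message prefix, which is exactly why the "zone: message" format must be respected in the traces.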
Warning
Debugging information can be displayed only after the command-line has been parsed. For example, no debugging information will be displayed while a waf tool is being loaded, whether for the command-line options (opt.load) or by the global initialization.
11.1.2 Build visualization
The Waf tool named parallel_debug is used to inject code into the Waf modules and to obtain a detailed execution trace. This module is provided in the folder waflib/extras and must be imported into one's project before use:
def options(ctx):
    ctx.load('parallel_debug', tooldir='.')

def configure(ctx):
    ctx.load('parallel_debug', tooldir='.')

def build(ctx):
    ctx(rule='touch ${TGT}', target='foo')
The execution will generate a diagram of the tasks executed during the build in the file pdebug.svg:
The details will be generated in the file pdebug.dat as space-separated values. The file can be processed by other applications such as Gnuplot to obtain other diagrams:
#! /usr/bin/env gnuplot
set terminal png
set output "output.png"
set ylabel "Amount of active threads"
set xlabel "Time in seconds"
set title "Active threads on a parallel build (waf -j5)"
unset label
set yrange [-0.1:5.2]
set ytic 1
plot 'pdebug.dat' using 3:7 with lines title "" lt 2
The data file columns are the following:

Table 11.2: pdebug file format

Column  Type    Description
1       int     Identifier of the thread which has started or finished processing a task
2       int     Identifier of the task processed
3       float   Event time
4       string  Type of the task processed
5       int     Amount of tasks processed
6       int     Amount of tasks waiting to be processed by the task consumers
7       int     Amount of active threads
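Such a file can also be post-processed with plain Python instead of Gnuplot. The sketch below computes the peak number of active threads from column 7; the helper name is illustrative:

```python
def max_active_threads(dat_text):
    """Scan pdebug-style space-separated records (column 7 holds
    the amount of active threads) and return the peak value."""
    peak = 0
    for line in dat_text.splitlines():
        cols = line.split()
        if len(cols) >= 7:
            peak = max(peak, int(cols[6]))
    return peak

# hypothetical excerpt of a pdebug.dat file
sample = """1 10 0.10 cxx 1 8 2
2 11 0.15 cxx 2 7 5
1 12 0.40 cxxprogram 3 0 1"""
print(max_active_threads(sample))  # 5
```

The same loop can be extended to aggregate per-task-type timings or to feed another plotting library.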
11.2 Profiling

11.2.1 Benchmark projects
The script utils/genbench.py is used as a base to create large C-like project files. The habitual use is the following:
$ utils/genbench.py /tmp/build 50 100 15 5
$ cd /tmp/build
$ waf configure
$ waf -p -j2
The C++ project created will generate 50 libraries with 100 class files each, each source file having 15 include headers pointing at the same library and 5 headers pointing at other headers randomly chosen. The compilation time may be discarded easily by disabling the actual compilation, for example:
def build(bld):
    from waflib import Task
    def touch_func(task):
        for x in task.outputs:
            x.write('')   # create an empty file instead of compiling
    for x in Task.TaskBase.classes.keys():
        cls = Task.TaskBase.classes[x]
        cls.func = touch_func
        cls.color = 'CYAN'
11.2.2 Profile traces
Profiling information is obtained by calling the module cProfile and by injecting specific code. The most interesting method to profile is waflib.Build.BuildContext.compile. The amount of function calls is usually a bottleneck, and reducing it results in noticeable speedups. Here is an example on the method compile:
from waflib.Build import BuildContext

def ncomp(self):
    import cProfile, pstats
    cProfile.runctx('self.orig_compile()', {}, {'self': self}, 'profi.txt')
    p = pstats.Stats('profi.txt')
    p.sort_stats('time').print_stats(45)

BuildContext.orig_compile = BuildContext.compile
BuildContext.compile = ncomp
Here is the output obtained on a benchmark build created as explained in the previous section:
Fri Jul 23 15:11:15 2010 profi.txt
1114979 function calls (1099879 primitive calls) in 5.768 CPU seconds

Ordered by: internal time
List reduced from 139 to 45 due to restriction <45>

ncalls       tottime  percall  cumtime  percall  filename:lineno(function)
109500         0.523    0.000    1.775    0.000  /comp/waf/waflib/Node.py:615(get_bld_sig)
5000           0.381    0.000    1.631    0.000  /comp/waf/waflib/Task.py:475(compute_sig_implicit_deps)
154550         0.286    0.000    0.286    0.000  {method 'update' of '_hashlib.HASH' objects}
265350         0.232    0.000    0.232    0.000  {id}
40201/25101    0.228    0.000    0.228    0.000  /comp/waf/waflib/Node.py:319(abspath)
10000          0.223    0.000    0.223    0.000  {open}
20000          0.197    0.000    0.197    0.000  {method 'read' of 'file' objects}
15000          0.193    0.000    0.349    0.000  /comp/waf/waflib/Task.py:270(uid)
10000          0.189    0.000    0.850    0.000  /comp/waf/waflib/Utils.py:96(h_file)
A few known hot spots are present in the library:

1. The persistence implemented by the cPickle module (the cache file to serialize may take a few megabytes).
2. Accessing configuration data from the Environment instances.
3. Computing implicit dependencies in general.
11.2.3 Optimization tips
The Waf source code has already been optimized in various ways. In practice, projects may use additional assumptions to replace certain methods or parameters in their build scripts. For example, if a project is always executed on Windows, then the framework and rpath variables may be removed:
from waflib.Tools.ccroot import USELIB_VARS
USELIB_VARS['cprogram'] = USELIB_VARS['cxxprogram'] = \
    set(['LIB', 'STLIB', 'LIBPATH', 'STLIBPATH', 'LINKFLAGS', 'LINKDEPS'])
11.3 Waf programming

11.3.1 Setting up a Waf directory for development
Waf is hosted on Google Code, and uses Git for source control. To obtain the development copy, use:
$ git clone http://code.google.com/p/waf/ wafdir $ cd wafdir $ ./waf-light --make-waf
To avoid regenerating Waf each time, the environment variable WAFDIR should be used to point at the directory containing waflib:
$ export WAFDIR=/path/to/directory/
11.3.2 Specific guidelines
Though Waf is written in Python, additional restrictions apply to the source code:

1. Indentation is tab-only, and the maximum line length should be about 200 characters.
2. The development code is kept compatible with Python 2.3, with the exception of decorators in the Tools directory. In particular, the Waf binary can be generated using Python 2.3.
3. The waflib modules must be insulated from the Tools modules to keep the Waf core small and language independent.
4. API compatibility is maintained in the cycle of a minor version (from 1.5.0 to 1.5.n).
Note More code always means more bugs. Whenever possible, unnecessary code must be removed, and the existing code base should be simplied.
Chapter 12
12.1
12.1.1
Waf consists of the following modules, which constitute the core library. They are located in the directory waflib/. The modules located under waflib/Tools and waflib/extras are extensions which are not part of the Waf core.

Table 12.1: List of core modules

Module     Role
Build      Defines the build context classes (build, clean, install, uninstall), which hold the data for one build (paths, configuration data)
Configure  Contains the configuration context class, which is used for launching configuration tests and writing the configuration settings for the build
ConfigSet  Contains a dictionary class which supports a lightweight copy scheme and provides persistence services
Context    Contains the base class for all waf commands (context parameters of the Waf commands)
Errors     Exceptions used in the Waf code
Logs       Logging system wrapping the calls to the python logging module
Node       Contains the file system representation class
Options    Provides a custom command-line option processing system based on optparse
Runner     Contains the task execution system (thread-based producer-consumer)
Scripting  Constitutes the entry point of the Waf application, executes the user commands such as build, configuration and installation
TaskGen    Provides the task generator system, and its extension system based on method addition
Task       Contains the task class definitions, and factory functions for creating new task classes
Utils      Contains support functions and classes used by other Waf modules
Not all core modules are required for using Waf as a library. The dependencies between the modules are represented on the following diagram. For example, the module Node requires both modules Utils and Errors. Conversely, if the module Build is used alone, then the modules Scripting and Configure can be removed safely.
[Figure: dependencies between the core modules: Scripting, Configure, Build, Options, Runner, TaskGen, Context, Task, ConfigSet, Node, Logs and Utils]
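The dependency relations can be explored programmatically. The sketch below computes transitive requirements from an adjacency mapping; the edge list is a partial assumption based on the diagram and the text (Node depending on Utils and Errors), not an authoritative extraction from waflib:

```python
def transitive(deps, module):
    """Return all modules required, directly or indirectly,
    by the given module (simple depth-first traversal)."""
    seen = set()
    stack = [module]
    while stack:
        m = stack.pop()
        for d in deps.get(m, ()):
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return seen

# assumed, partial edge list for illustration
deps = {
    'Build': ['Node', 'ConfigSet', 'Logs'],
    'Node':  ['Utils', 'Errors'],
    'Utils': ['Errors'],
}
print(sorted(transitive(deps, 'Build')))
# ['ConfigSet', 'Errors', 'Logs', 'Node', 'Utils']
```

Such a traversal answers questions like the one in the text: which modules can be dropped safely when only Build is used.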
12.1.2 Context classes
User commands, such as configure or build, are represented by classes derived from waflib.Context.Context. When a command does not have a class associated, the base class waflib.Context.Context is used instead. The method execute is the start point for a context execution; it often calls the method recurse to start reading the user scripts and execute the functions referenced by the fun class attribute. The command is associated to a context class by the class attribute cmd set on the class. Context subclasses are added in waflib.Context.classes by the metaclass store_context and loaded through the function waflib.Context.create_context. The classes defined last will replace existing commands. As an example, the following context class will define or override the configure command. When calling waf configure, the function foo will be called from wscript files:
from waflib.Context import Context

class somename(Context):
    cmd = 'configure'
    fun = 'foo'
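The registration mechanism can be modelled in plain Python (a simplified sketch of the store_context idea, not the actual waflib code): a metaclass records every subclass in a registry keyed by the cmd attribute, so the class defined last wins.

```python
# Simplified model of command registration via a metaclass
# (illustration only, not the real waflib.Context implementation)
classes = {}

class store_context(type):
    def __init__(cls, name, bases, dct):
        super().__init__(name, bases, dct)
        if getattr(cls, 'cmd', None):
            # the class defined last replaces any earlier registration
            classes[cls.cmd] = cls

class Context(metaclass=store_context):
    cmd = None

class configure_context(Context):
    cmd = 'configure'

class my_configure(Context):  # defined later: overrides 'configure'
    cmd = 'configure'

def create_context(cmd, *args, **kw):
    # look up the most recently registered class for the command
    return classes[cmd](*args, **kw)

print(type(create_context('configure')).__name__)  # prints my_configure
```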
12.1.3 Build classes
The class waflib.Build.BuildContext and its subclasses, such as waflib.Build.InstallContext or waflib.Build.StepContext, have task generators created when reading the user scripts. The task generators will usually have task instances, depending on the operations performed after all task generators have been processed. ConfigSet instances are copied from the build context to the tasks (waflib.ConfigSet.ConfigSet.derive) to propagate values such as configuration flags. A copy-on-write is performed through most methods of that class (append_value, prepend_value, append_unique). The Parallel object encapsulates the iteration over all tasks of the build context and delegates the execution to thread objects (producer-consumer). The overall structure is represented in the following diagram:
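The derive/copy-on-write behaviour can be sketched in a few lines of plain Python (a simplified model, not the real waflib.ConfigSet class): a derived set keeps a reference to its parent and only copies a value into its own table the first time it is modified.

```python
# Minimal model of ConfigSet.derive() copy-on-write semantics
# (illustration only, not the real waflib.ConfigSet implementation)
class MiniConfigSet:
    def __init__(self, parent=None):
        self.table = {}
        self.parent = parent

    def __getitem__(self, key):
        # walk up the parent chain until the value is found
        env = self
        while env is not None:
            if key in env.table:
                return env.table[key]
            env = env.parent
        return []  # like ConfigSet, missing keys default to an empty list

    def derive(self):
        # cheap copy: the child only records its own modifications
        return MiniConfigSet(parent=self)

    def append_value(self, key, values):
        # copy-on-write: pull the inherited value into our own table first
        self.table[key] = list(self[key]) + list(values)

base = MiniConfigSet()
base.append_value('CFLAGS', ['-O2'])
child = base.derive()
child.append_value('CFLAGS', ['-g'])

print(base['CFLAGS'])   # ['-O2'] - the parent is unchanged
print(child['CFLAGS'])  # ['-O2', '-g']
```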
12.2 Context objects

12.2.1 Context commands and recursion
The context commands are designed to be as independent as possible, and may be executed concurrently. The main application is the execution of small builds as part of configuration tests. For example, the method waflib.Tools.c_config.run_c_code creates a private build context internally to perform the tests. Here is an example of a build that creates and executes simple configuration contexts concurrently:
import os
from waflib.Configure import conf, ConfigurationContext
from waflib import Task, Build, Logs

def options(ctx):
    ctx.load('compiler_c')

def configure(ctx):
    ctx.load('compiler_c')

def build(ctx):
    ctx(rule=run_test, always=True, header_name='stdio.h')  # (1)
    ctx(rule=run_test, always=True, header_name='unistd.h')

def run_test(self):
    top = self.generator.bld.srcnode.abspath()
    out = self.generator.bld.bldnode.abspath()

    ctx = ConfigurationContext(top_dir=top, out_dir=out)  # (2)
    ctx.init_dirs()  # (3)

    ctx.in_msg = 1  # (4)
    ctx.msg('test')  # (5)

    header = self.generator.header_name
    logfile = self.generator.path.get_bld().abspath() + os.sep \
        + header + '.log'
    ctx.logger = Logs.make_logger(logfile, header)  # (6)

    ctx.env = self.env.derive()  # (7)
    ctx.check(header_name=header)  # (8)

(1) Create task generators which will run the method run_test defined below
(2) Create a new configuration context as part of a Task.run call
(3) Initialize ctx.srcnode and ctx.bldnode (build and configuration contexts only)
(4) Set the internal counter for the context methods msg, start_msg and end_msg
(5) The console output is disabled (non-zero counter value to disable nested messages)
(6) Each context may have a logger to redirect the error messages
(7) Initialize the default environment to a copy of the task one
(8) Perform a configuration check
After executing waf build, the project folder will contain the new log files:
$ tree
.
|-- build
|   |-- c4che
|   |   |-- build.config.py
|   |   `-- _cache.py
|   |-- config.log
|   |-- stdio.h.log
|   `-- unistd.h.log
`-- wscript
A few measures are set to ensure that the contexts can be executed concurrently:

1. Context objects may use different loggers derived from the waflib.Logs module.
2. Each context object is associated with a private subclass of waflib.Node.Node to ensure that the node objects are unique. To pickle Node objects, it is important to prevent concurrent access by using the lock object waflib.Node.pickle_lock.
12.2.2 Build context and persistence
The build context holds all the information necessary for a build. To accelerate the start-up, a part of the information is stored and loaded between the runs. The persistent attributes are the following:

Table 12.2: Persistent attributes

Attribute   Description                                              Type
root        Node representing the root of the file system            Node
node_deps   Implicit dependencies                                    dict mapping Node to signatures
raw_deps    Implicit file dependencies which could not be resolved   dict mapping Node ids to any serializable type
task_sigs   Signature of the tasks executed                          dict mapping a Task computed uid to a hash
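The store/load cycle for these attributes can be sketched with the standard pickle module (a simplified illustration; waflib keeps this data in a cache file under the build directory, and the file name below is just an example):

```python
# Sketch of persisting build data between runs via pickle
# (illustration only; file layout and names are hypothetical)
import os
import pickle
import tempfile

def store(path, data):
    with open(path, 'wb') as f:
        pickle.dump(data, f)

def restore(path):
    try:
        with open(path, 'rb') as f:
            return pickle.load(f)
    except (OSError, EOFError):
        # first run: start with empty defaults
        return {'node_deps': {}, 'raw_deps': {}, 'task_sigs': {}}

cache = os.path.join(tempfile.mkdtemp(), '_cache.py')
data = restore(cache)                    # first run: empty defaults
data['task_sigs']['some_uid'] = b'hash'  # record a task signature
store(cache, data)

print(restore(cache)['task_sigs'])       # survives across "runs"
```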
12.3 Support for c-like languages

12.3.1 Compiled tasks and link tasks
The tool waflib.Tools.ccroot provides a system for creating object files and linking them into a single final file. The method waflib.Tools.ccroot.apply_link is called after the method waflib.TaskGen.process_source to create the link task. In pseudocode:
call the method process_source:
    for each source file foo.ext:
        process the file by extension
            if the method create_compiled_task is used:
                create a new task
                set the output file name to be foo.ext.o
                add the task to the list self.compiled_tasks

call the method apply_link:
    for each name N in self.features:
        find a class named N:
            if the class N derives from waflib.Tools.ccroot.link_task:
                create a task of that class, assign it to self.link_task
                set the link_task inputs from self.compiled_tasks
                set the link_task output name to be env.N_PATTERN % self.target
                stop
This system is used for assembly, C, C++, D and Fortran by default. Note that the method apply_link is supposed to be called after the method process_source. We will now demonstrate how to support the following mini language:
cp: .ext -> .o cat: *.o -> .exe
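A tool supporting this mini language follows the ccroot pattern described above. The following wscript fragment is a sketch (it requires the waf runtime, so it is shown for illustration only); the numbered comments correspond to the notes that follow:

```python
from waflib import Task
from waflib.Tools import ccroot  # (1) binds create_compiled_task / apply_link
from waflib.TaskGen import feature, extension, after_method

@feature('mylink')
@after_method('process_source')
def call_apply_link(self):  # (2)
    self.apply_link()

class mylink(ccroot.link_task):  # (3) must derive from a link task class
    run_str = 'cat ${SRC} > ${TGT}'

class ext2o(Task.Task):
    run_str = 'cp ${SRC} ${TGT}'

@extension('.ext')
def process_ext(self, node):
    self.create_compiled_task('ext2o', node)  # (4)

def build(bld):
    bld(features='mylink', source='foo.ext', target='app')
```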
(1) This import binds the methods such as create_compiled_task and apply_link
(2) An alternate definition would be calling waflib.TaskGen.feats['mylink'] = [apply_link]
(3) The link task must be a subclass of another link task class
(4) Calling the method create_compiled_task
Note Task generator instances have at most one link task instance
12.4 Writing re-usable Waf tools

12.4.1 Adding a Waf tool

12.4.1.1 Importing the code
The intent of the Waf tools is to promote high cohesion by moving all conceptually related methods and classes into separate files, hidden from the Waf core, and as independent from each other as possible. Custom Waf tools can be kept in the projects, added to a custom waf file through the waflib/extras folder, or used through sys.path changes. The tools can import other tools directly through the import keyword. The scripts, however, should always load the tools through ctx.load to limit the coupling. Compare for example:
def configure(ctx):
    from waflib.extras.foo import method1
    method1(ctx)
and:
def configure(ctx):
    ctx.load('foo')
    ctx.method1()
The second version should be preferred, as it makes fewer assumptions about whether method1 comes from the module foo, and about where the module foo is located.
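The decoupling provided by ctx.load can be modelled in plain Python (a simplified sketch of the @conf idea, not the waflib implementation): functions defined in a tool are bound as methods of the context class, so user scripts call them without knowing which module defines them.

```python
# Simplified model of @conf-style method binding (illustration only)
class Context:
    pass

def conf(func):
    # bind the function as a method of Context, like waflib's @conf
    setattr(Context, func.__name__, func)
    return func

# in a tool module somewhere:
@conf
def method1(self):
    return 'configured'

# in a user script: the caller never imports the tool module directly
ctx = Context()
print(ctx.method1())  # prints configured
```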
12.4.1.2 Naming convention for C/C++/Fortran
The tools compiler_c, compiler_cxx and compiler_fc use other waf tools to detect the presence of particular compilers. They provide a naming convention that gives new tools a chance to register themselves automatically and save the import in user scripts. Tools with names beginning with c_, cxx_ and fc_ will be tested. The registration code will be similar to the following:
from waflib.Tools.compiler_X import X_compiler
X_compiler[platform].append(module_name)
where X represents the type of compiler (c, cxx or fc), platform is the platform on which the detection should take place (linux, win32, etc), and module_name is the name of the tool to use.
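The convention amounts to a simple registry keyed by platform. As a plain-Python sketch (the dictionary contents and the tool name c_mycc are hypothetical, not the actual waflib tables):

```python
# Hypothetical per-platform compiler registry, sketching the convention
c_compiler = {
    'linux': ['gcc', 'icc'],
    'win32': ['msvc', 'gcc'],
}

# a new tool named 'c_mycc' registers itself for detection on linux
c_compiler['linux'].append('c_mycc')

print(c_compiler['linux'])  # ['gcc', 'icc', 'c_mycc']
```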
12.4.2 Command methods

12.4.2.1 Subclassing is only for commands
As a general rule, subclasses of waflib.Context.Context are created only when a new user command is necessary. This is the case, for example, when a command for a specific variant (output folder) is required, or to provide a new behaviour. When this happens, the class methods recurse and execute or the class attributes cmd and fun are usually overridden.
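For example, a variant build command can be declared by subclassing the build context (a wscript fragment requiring the waf runtime; the command and folder names are arbitrary):

```python
from waflib.Build import BuildContext

class debug(BuildContext):
    '''executes the build in a "debug" output subfolder'''
    cmd = 'debug'
    variant = 'debug'
```

Running waf debug then builds the same targets into the build/debug output directory.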
Note If there is no new command needed, do not use subclassing.
12.4.2.2 Domain-specific methods are convenient
Although the Waf framework promotes the most flexible way of declaring tasks through task generators, it is often more convenient to declare domain-specific wrappers in large projects. For example, the samba project provides a function used as:
bld.SAMBA_SUBSYSTEM('NDR_NBT_BUF',
    source    = 'nbtname.c',
    deps      = 'talloc',
    autoproto = 'nbtname.h'
)
12.4.2.3 How to bind new methods
New methods are commonly bound to the build context or to the configuration context by using the @conf decorator:
from waflib.Configure import conf

@conf
def enterprise_program(self, *k, **kw):
    kw['features'] = 'c cprogram debug_tasks'
    return self(*k, **kw)

def build(bld):
    # no feature line
    bld.enterprise_program(source='main.c', target='app')
The methods should always be bound in this manner or manually, as subclassing may create conflicts between tools written for different purposes.
Chapter 13
Further reading
Due to the amount of features provided by Waf, this book cannot be both complete and up-to-date. For greater understanding and practice, the following links are recommended to the reader:

Table 13.1: Recommended links

Link                                                                        Description
http://docs.waf.googlecode.com/git/apidocs_16/index.html                    The apidocs
http://code.google.com/p/waf                                                The Waf project page
http://code.google.com/p/waf/w/list                                         The Waf wiki, including the frequently asked questions (FAQ)
http://groups.google.com/group/waf-users                                    The Waf mailing-list
http://waf-devel.blogspot.com/2011/01/python-32-and-build-system-kit.html   Information on the build system kit
Chapter 14
Glossary
Build Order
    The build order is the sequence in which tasks must be executed. Because tasks can be executed in parallel, several build orders can be computed, depending on the constraints between the tasks. When a build order cannot be computed (usually because of contradictory order constraints), the build is said to be in a deadlock.

Dependency
    A dependency represents the conditions by which a task can be considered up-to-date or not (execution status). The dependencies can be explicit (file inputs and outputs) or abstract (dependency on a value, for example).

Task generator
    A task generator is an object instance of the class Task.task_gen. The task generators encapsulate the creation of various task instances at a time, and simplify the creation of ordering constraints between them (for example, compilation tasks are executed before link tasks).

Task
    A Waf task is an object instance of the class Task.TaskBase. Waf tasks may be simple (Task.TaskBase) or related to the filesystem (Task.Task). Tasks represent the production of something during the build (files in general), and may be executed in sequence (with ordering constraints) or in parallel.

Tool
    A Waf tool is a python module containing Waf-specific extensions. The Waf tools are located in the folder waflib/Tools/ and usually contain a global variable configure which may reference functions to execute in the configuration.

Node
    The Node class is a data structure used to represent the filesystem in an efficient manner. The node objects may represent files or folders. File nodes are associated with signature objects. The signature can be a hash of the file contents (source files) or a task signature (build files).

Command
    A function present in the top-level project file (wscript) and accepting a waflib.Context.Context instance as unique input parameter. The function is executed when its name is given on the command-line (for example, running waf configure will execute the function configure).

Variant
    An additional output directory used to enable several (build) commands to create the same targets with different compilation flags.