diff --git a/doc/cross_module_dependencies.html b/doc/cross_module_dependencies.html new file mode 100644 index 00000000..3d416bf5 --- /dev/null +++ b/doc/cross_module_dependencies.html @@ -0,0 +1,253 @@ + +
+
+
+
++At first sight this might seem natural and straightforward. However, it +is a fairly complex problem to establish cross-extension-module +dependencies while maintaining the same ease of use Boost.Python +provides for classes that are wrapped in the same extension module. To +a large extent this complexity can be hidden from the author of a +Boost.Python extension module, but not entirely. + +
+import std_vector +v = std_vector.double([1, 2, 3, 4]) +v.push_back(5) +v.size() ++ +Suppose the std_vector module is done well and reflects all C++ +functions that are useful at the Python level, for all C++ built-in +data types (std_vector.int, std_vector.long, etc.). + +
+Suppose further that there is a statistics module with a C++ class that
+has constructors or member functions that use or return a std::vector.
+For example:
+
+class xy {
+ private:
+ std::vector<double> m_x;
+ std::vector<double> m_y;
+ public:
+ xy(const std::vector<double>& x, const std::vector<double>& y) : m_x(x), m_y(y) {}
+ const std::vector<double>& x() const { return m_x; }
+ const std::vector<double>& y() const { return m_y; }
+ double correlation();
+};
+
+
+What is more natural than reusing the std_vector extension module to
+expose these constructors or functions to Python?
+
++Unfortunately, what seems natural needs a little work in both the +std_vector and the statistics module. + +
+In the std_vector extension module, std::vector<double> needs to be +exposed to Python with the x_class_builder<> template instead of the +regular class_builder<>. For example: + +
+ x_class_builder<std::vector<double> > v_double(std_vector_module, "double"); ++ +In the extension module that wraps class xy we need to use +the import_class_builder<> template: + +
+ import_class_builder<std::vector<double> > v_double("std_vector", "double");
+
+
+That is all. All the properties defined for std_vector.double in the
+std_vector Boost.Python module will be available for the objects
+returned by xy.x() and xy.y(). Similarly, the constructor for xy
+will accept objects that were created by the std_vector module.
+
++xptr_class_builder<store> py_store(your_module, "store"); ++ +The corresponding import_class_builder<> does not need any special +attention: + +
+import_class_builder<store> py_store("noncopyable_export", "store");
+
+
++import std_vector +import statistics +x = std_vector.double([1, 2, 3, 4]) +y = std_vector.double([2, 4, 6, 8]) +xy = statistics.xy(x, y) +xy.correlation() ++ +In this example it is clear that Python has to be able to find both the +std_vector and the statistics extension module. In other words, both +extension modules need to be in the Python module search path +(sys.path). + +
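Whether all required modules are actually reachable can be verified from Python before anything is imported. The sketch below is a pure-Python illustration, not part of Boost.Python; it uses the standard-library `importlib.util.find_spec`, and the module names passed in are stand-ins for std_vector and statistics:

```python
import importlib.util


def modules_available(*names):
    """Report which top-level modules Python can find on sys.path.

    find_spec() searches the import path without executing the
    module, so it is a cheap way to verify that every extension
    module a package depends on is actually reachable.
    """
    return {name: importlib.util.find_spec(name) is not None
            for name in names}


# Standard-library and nonexistent names used as stand-ins for
# the std_vector and statistics extension modules of the example.
print(modules_available("json", "no_such_module"))
# → {'json': True, 'no_such_module': False}
```

If either entry is False here, the corresponding import at run time would fail with the ImportError discussed below.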
+The situation is not always that obvious. Suppose the statistics +module has a random function that returns a vector of random +numbers with a given length: + +
+import statistics
+x = statistics.random(5)
+y = statistics.random(5)
+xy = statistics.xy(x, y)
+xy.correlation()
++
+A naive user will not easily anticipate that the std_vector module is
+used to pass the x and y vectors around. If the std_vector module is in
+the Python module search path, this form of ignorance does no harm.
+On the contrary, we are glad that we do not have to bother the user
+with details like this.
+
+If the std_vector module is not in the Python module search path, a +Python exception will be raised: + +
+Traceback (innermost last):
+  File "foo.py", line 2, in ?
+    x = statistics.random(5)
+ImportError: No module named std_vector
++
+As is the case with any system of non-trivial complexity, it is
+important that the setup is consistent and complete.
+
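A thin Python wrapper can translate this bare ImportError into a message that names the hidden dependency for the user. This is a hedged sketch, not part of Boost.Python; statistics_ext is a hypothetical module name standing in for the statistics extension module of the example:

```python
def load_statistics():
    """Import a module that has hidden cross-module dependencies,
    turning a bare ImportError into an actionable message.

    'statistics_ext' is a hypothetical name; a real wrapper would
    use the actual extension module name.
    """
    try:
        import statistics_ext
        return statistics_ext
    except ImportError as exc:
        raise ImportError(
            f"{exc}. Make sure all extension modules this module "
            "depends on (e.g. std_vector) are on sys.path."
        ) from exc
```

The `from exc` clause preserves the original traceback, so the underlying cause ("No module named std_vector") remains visible alongside the friendlier hint.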
+Suppose there is a module ivect that implements vectors of integers,
+and a similar module dvect that implements vectors of doubles. We want
+to be able to convert an integer vector to a double vector and vice
+versa. For example:
+
+import ivect
+iv = ivect.ivect((1,2,3,4,5))
+dv = iv.as_dvect()
++
+The last expression will implicitly import the dvect module in order to
+enable the conversion of the C++ representation of a dvect to a Python
+object. The analogous conversion is possible for a dvect:
+
+import dvect +dv = dvect.dvect((1,2,3,4,5)) +iv = dv.as_ivect() ++ +Now the ivect module is imported implicitly. + +
+Note that the two-way dependencies are possible because the
+dependencies are resolved only when needed. That is, the initialization
+of the ivect module does not rely on the dvect module, and vice versa.
+Only if as_dvect() or as_ivect() is actually invoked will the
+corresponding module be implicitly imported. This also means that, for
+example, the dvect module does not have to be available at all if
+as_dvect() is never used.
+
+If a library is wrapped that consists of both header files and compiled
+components (e.g. libdvect.a, dvect.lib, etc.), both the Boost.Python
+extension module with the x_class_builder<> and the module with the
+import_class_builder<> need to be linked against the object library.
+Ideally one would build a shared library (e.g. libdvect.so, dvect.dll,
+etc.). However, this introduces the issue of getting the search path
+for the dynamic loading configured correctly. For small libraries it is
+therefore often more convenient to ignore the fact that the object
+files are loaded into memory more than once.
+
+The main purpose of Boost.Python's support for resolving cross-module +dependencies at runtime is to allow for a modular system layout. With +this support it is straightforward to reflect C++ code organization at +the Python level. Without the cross-module support, a multi-purpose +module like std_vector would be impractical because the entire wrapper +code would somehow have to be duplicated in all extension modules that +use it, making them harder to maintain and harder to build. + +
+Finally, there is an important psychological component. If a group of
+classes is lumped together with many others in a huge module, it is
+difficult for the authors to be identified with their work. The
+situation is much more transparent if the work is represented by a
+module with a recognizable name. This is not just a question of strong
+egos, but also of getting credit and funding.
+