The C++ version of the tail fit is wrapped to Python. I also changed
the code slightly so that when replace_by_fit is true, the data is
changed all the way to the end of the Matsubara axis, not only on the
interval where the fit was done.
- The issue comes from the fact that the default += (and co.) generated
  by the Python API is the one for immutable types, like int.
- Indeed, in Python, for an int :
      >>> x = 1
      >>> id(x)
      140266967205832
      >>> x += 1
      >>> id(x)
      140266967205808
- For a mutable type, like a gf, it is necessary to
  explicitly add the xxx_inplace_add functions
  (a small Python illustration follows this list).
- Added :
- the generation of the inplace_xxx functions
  - a method in class_ in the wrapper generator that
    deduces all += operators from the + operators.
- this assumes that the +=, ... are defined in C++.
  - The generation of such operators is optional, controlled by the
    with_inplace_operators option in the arithmetic flag.
- Also added the overloads g += M and g -= M for
  g : GfImfreq, M a complex matrix.
  Mainly for legacy Python codes.
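  To make the point concrete, here is a minimal pure-Python sketch (FakeGf is a
  hypothetical stand-in, not the wrapped type) : an int is rebound to a new object
  by +=, while a mutable type defining __iadd__ (the role played by the generated
  xxx_inplace_add) is modified in place and keeps its identity.

      class FakeGf(object):
          """Hypothetical stand-in for a wrapped gf; __iadd__ plays the role of the inplace add."""
          def __init__(self, data):
              self.data = data
          def __iadd__(self, other):
              self.data += other      # modify in place ...
              return self             # ... and return the same object

      g = FakeGf(1.0)
      i0 = id(g)
      g += 2.0
      assert id(g) == i0              # same object, unlike the int above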
- When a type has no Python constructor (e.g. parameters),
  do not define the xxx_new function.
  - This leads to a better error message when trying to construct
    such an object in Python.
- TODO : check there is no issue with serialization ?
- detect the modules used (use_module).
- clean the code specific to wrapper generation from clang_parser.
- add support for default arguments for int, double, char.
- TODO : add more complex default arguments
- hdfarchive : transform the keys to strings with str,
  because they can be unicode and the C wrapper does not convert Python unicode strings.
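  A minimal sketch of the idea (not the actual hdfarchive code) : keys obtained
  from Python iteration may be unicode objects, so they are forced to plain str
  before being handed to the C wrapper.

      key = u'g_up'            # e.g. a key coming from a dict or a subgroup
      c_key = str(key)         # plain str, which the C wrapper accepts
      assert isinstance(c_key, str)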
- gf : Correct the scheme for BlockGf : not very clean, to be improved ?
- Given a C++ file, e.g. a class,
  the script calls libclang to parse the C++, and retrieves from
  its AST the necessary info to write a xxx_desc.py file.
- THIS IS WORK IN PROGRESS. There are several corner cases for which we
may want (or not) the script to do better.
- It is not designed to be used automatically, but to do 90 % of the
  boring typesetting work...
- The preamble still needs manual choices
- The properties, methods, functions are automatically declared in
the _desc file, in the simplest possible way.
- An option --properties, -p : transforms some simple methods or
  get_x, set_x pairs into Python properties instead of methods.
  Cf. doc, and the sketch after this list.
- requires clang (tested on 3.4).
- the script is configured by cmake and installed in
  INSTALLATION_DIRECTORY/bin, with some other files.
  It can only be used for applications, after the lib has been installed.
  The cmake configuration automatically includes the various include
  paths configured in the triqs installation, including the triqs install dir,
  in order to simplify invocation.
- TODO : improve, and test more in real cases.
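  As a hedged illustration of the output the script aims at (keyword names and
  exact spellings are indicative only, cf. the doc and the test _desc files) :
  for a C++ class with a simple getter, the default is to declare a plain method,
  while --properties would turn it into a Python property.

      # hypothetical fragment of a generated my_class_desc.py
      from wrap_generator import *

      module = module_(full_name = "my_class", doc = "Wrapped my_class")
      c = class_(py_type = "MyClass", c_type = "my_class")

      # default output : the getter is declared as a plain method
      c.add_method(name = "get_x", signature = "double get_x()", doc = "")

      # with --properties, -p : the same getter would become a Python property
      # c.add_property(name = "x", getter = cfunction("double get_x()"), doc = "")

      module.add_class(c)
      module.generate_code()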
- C++14 mode is no longer automatic. It was based on the compiler version,
  but the version of the C++ lib also matters....
  Now there is an explicit USE_CPP14 option [default = OFF]
  to set in order to compile in c++1y mode.
  This also solves #89, i.e. the flag is now in TRIQS_CXX_DEFINITIONS and
  therefore passed to applications (including ipython magic).
- Python_use_mpi4py is now ON by default.
  Rationale : it has been used for a while, e.g. by Thomas,
  and it is necessary on OS X (boost.mpi.python raises a lot of issues).
  Hence we make it the default.
- Rename the option Build_Triqs_General_Tools_Test to a simpler
Build_Tests
- Add the possibility to give a function "on the fly"
  for the precall and postcall of a Python wrapped function.
  - No change for previous code; it is simply a new feature.
- changed test accordingly. See my_module_desc.py for an example.
- For users : the only change is that
  H5::H5File in apps is to be replaced by triqs::h5::file, with the same API.
- using only the C API because :
  - it is cleaner, better documented, with more examples.
- it is the native hdf5 interface.
  - it simplifies the installation, e.g. on Mac. Indeed, hdf5 is
    usually installed without the C++ interface, which is optional
    (e.g. EPD et al., brew by default).
    There is also the infamous mpi + hdf5_cpp bug, for which we have no clean solution.
- clean the notion of parent of a group. Not needed; there is a better iterate function in the C LT API.
- modified doc : no need for C++ bindings any more.
- modified cmake to avoid requiring CPP bindings.
- import arrays in extensions (mako file).
  - put import_arrays in the converter,
    along the lines of our own objects (numpy and triqs use
    the same capsule technique, i.e. the standard technique from the Python
    doc).
- for all functions, except when the GIL option (not implemented) is set,
  we use a triqs::py_stream, which redirects the stream
  to the Python sys.write
  - so that a simple C++ code with a std::cout
    prints its output in the notebook...
  - NB : it only works for the code within the cell.
    If it calls another function of the lib which uses cout,
    the print is not redirected.
    Designed for simple tests....
  - cerr is not (yet) redirected. (useful ?)
- examples are split from the rst file using a Python script (split_code).
- Final result for the doc is unchanged.
- examples are compiled and tested with the other tests.
- the examples' code has been clang-formatted, with triqs style.
- doc compiles much faster, and with the same options as the rest of the
  tests.
- examples are added as tests, so they are run by make test, as simple C
tests.
- done for the tutorials and the reference.
- autocompile removed (changed into triqs_example directive).
- add triqs_example :
  - makes a literal include of the source code.
  - runs the compiled example.
  - adds, as before, the result to the source code in the doc.
- added the script split_code, used to make the changes automatically,
maybe for later reuse. (in _tools)
- clean the c_name.
- add a more refined signature (with c_name optionally in it).
- add some autodoc.
- clean code : move class in nested, remove useless dict call, etc...
- operator2 : move unary - and unit in algebra in general wrapper.
- various name changes to make private functions start with _, for
  autodoc.
- Add to the wrapper generator (add_method) the release_GIL_and_enable_signal
  option (a usage sketch follows this list), which :
  - releases the GIL
  - saves the Python signal handler
  - enables the C++ triqs signal handler instead
  - undoes all of this after the code runs, or in case of an exception.
  - based on the Python include ceval.h, cf. the comments at line 72 and below.
- reworked the triqs::signal_handler :
  simple C-like functions, no object (no need).
  start, stop, received; cf. the header file.
- clean call_back.cpp : the only place using the signal directly
  (qmc uses the callback).
  In particular, remove the old BOOST CHRONO, since
  std::chrono works fine on the platforms we use now.
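  A hedged sketch of how the option would appear in a xxx_desc.py file (the method
  name and signature are hypothetical; only the option name is the one introduced
  above) :

      # hypothetical declaration of a long-running wrapped method
      c.add_method(name = "solve", signature = "void solve()",
                   release_GIL_and_enable_signal = True,
                   doc = "GIL released and C++ signal handler enabled during the call")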
- a module can use the converters of another module
  with the use_module('A') command (a sketch follows this list).
  In that case :
  - the generated converter header for A will be included.
  - the header, at the top, now contains a simple list of all
    wrapped types, which is then included in the wrapped_types
    of the module for proper code generation.
- simplify the code generation : just generate_code.
- all arguments are analyzed from sys.argv at the import of the
wrap_generator module. In any case, the xx_desc.py will be called from the corresponding
cmake command, hence with the right arguments.
- Added a dependency in my_module_B of wrap_test to show how to declare
  the dependencies in the cmake file, if needed.
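  A hedged sketch of a dependent module's _desc.py (module and class names are
  hypothetical; use_module and generate_code are the commands described above) :

      # hypothetical my_module_B_desc.py reusing the converters of module A
      from wrap_generator import *

      module = module_(full_name = "my_module_B", doc = "Module B, built on top of A")
      module.use_module('A')    # includes A's generated converter header and merges
                                # A's wrapped types into this module's wrapped_types
      module.generate_code()    # single entry point for code generation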
- change the constructor wrapper (a Python-level analogue follows this list).
  - in the new method, leave the pointer _c set to NULL.
  - in the init, allocate it.
  - It seems ok to leave the object in this uninitialized state,
    but that is not so clear from the doc.
    Added a check for this pointer == NULL in the converters.
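  A Python-level analogue of the new scheme (the actual change is in the C-API
  new/init slots of the wrapped type; FakeWrapped and its list member are purely
  illustrative) :

      class FakeWrapped(object):
          def __new__(cls, *args, **kw):
              self = super(FakeWrapped, cls).__new__(cls)
              self._c = None          # 'new' leaves the underlying pointer NULL
              return self
          def __init__(self, n):
              self._c = [0.0] * n     # 'init' actually allocates the object

      g = FakeWrapped(4)
      assert g._c is not None         # converters now also check the pointer is not NULL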
- Use a new buffered_function to replace the complicated generator code
  from ALPS (a rough sketch of the idea follows below).
- Clean the implementation of the random_generator
- update the documentation
- update to the new Python wrapper (this could not be done with the previous
  version, because of the lack of a move constructor).
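  A rough Python sketch of the buffered_function idea (the real implementation is
  a small C++ class; all names here are illustrative) : values are produced in
  batches and handed out one by one, refilling the buffer when it is exhausted.

      import random

      class BufferedFunction(object):
          """Illustrative analogue : buffers the values produced by a generator function f."""
          def __init__(self, f, buffer_size = 1000):
              self.f, self.size = f, buffer_size
              self.buf, self.pos = [], 0
          def __call__(self):
              if self.pos == len(self.buf):         # buffer exhausted : refill it
                  self.buf = [self.f() for _ in range(self.size)]
                  self.pos = 0
              v = self.buf[self.pos]
              self.pos += 1
              return v

      rng = BufferedFunction(random.random)
      x = rng()                                     # one value taken from the buffer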