mirror of https://github.com/triqs/dft_tools (synced 2024-12-22 04:13:47 +01:00)

commit 8bbbe81c7d (parent 4ae17571a9)

Converters clean up, new subgroup names

* Provided script update_archive.py to convert old h5 archives.
* Fixed all tests.
@@ -27,7 +27,7 @@ The only necessary parameter is the filename of the hdf5 archive. In addition, t
 * `use_lda_blocks`: If true, the structure of the density matrix is analysed at initialisation, and non-zero matrix elements
 are identified. The DMFT calculation is then restricted to these matrix elements, yielding a more efficient solution of the
 local interaction problem. Degeneracies in orbital and spin space are also identified and stored for later use. The default value is `False`.
-* `lda_data`, `symm_corr_data`, `par_proj_data`, `symm_par_data`, `bands_data`: These string variables define the subgroups in the hdf5 arxive,
+* `lda_data`, `symmcorr_data`, `parproj_data`, `symmpar_data`, `bands_data`: These string variables define the subgroups in the hdf5 arxive,
 where the corresponding information is stored. The default values are consistent with those in :ref:`interfacetowien`.

 At initialisation, the necessary data is read from the hdf5 file. If a calculation is restarted based on a previous hdf5 file, information on
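
For illustration, a minimal sketch of how these keyword arguments look after the rename. The import path and the archive name 'material.h5' are assumptions, not part of this commit:

    # Hypothetical usage sketch: construct SumkLDA on an archive written by the
    # updated converters; the *_data keywords default to the new subgroup names.
    from pytriqs.applications.dft.sumk_lda import SumkLDA   # assumed import path

    SK = SumkLDA(hdf_file = 'material.h5',            # archive produced by the converter
                 use_lda_blocks = True,               # analyse the density-matrix block structure
                 lda_data = 'lda_input',              # new default subgroup names, shown explicitly
                 symmcorr_data = 'lda_symmcorr_input',
                 parproj_data = 'lda_parproj_input',
                 symmpar_data = 'lda_symmpar_input',
                 bands_data = 'lda_bands_input')
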
@@ -35,9 +35,9 @@ There are three optional parameters to the Constructor:

 * `lda_subgrp`: We store all data in subgroups of the hdf5 arxive. For the main data
 that is needed for the DMFT loop, we use the subgroup specified by this optional parameter.
-The default value `SumK_LDA` is used as the subgroup name.
-* `symm_subgrp`: In this subgroup we store all the data for applying the symmetry
-operations in the DMFT loop. The default value is `SymmCorr`.
+The default value `lda_input` is used as the subgroup name.
+* `symmcorr_subgrp`: In this subgroup we store all the data for applying the symmetry
+operations in the DMFT loop. The default value is `lda_symmcorr_input`.
 * `repacking`: If true, and the hdf5 file already exists, the system command :program:`h5repack`
 is invoked. This command ensures a minimal file size of the hdf5
 file. The default value is `False`. If you wish to use this, ensure
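
For illustration, a hedged sketch of a converter call using these optional parameters; the values shown are exactly the new defaults, and the import path is an assumption, not part of this diff:

    # Hypothetical usage sketch of the Wien2k converter defined later in this commit.
    from pytriqs.applications.dft.converters.wien2k_converter import Wien2kConverter  # assumed import path

    Converter = Wien2kConverter(filename = 'material_of_interest',     # root of the Wien2k file names
                                lda_subgrp = 'lda_input',
                                symmcorr_subgrp = 'lda_symmcorr_input',
                                repacking = True)                      # run h5repack on an existing archive
    Converter.convert_dmft_input()   # writes the main data into 'lda_input' and 'lda_symmcorr_input'
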
@@ -70,8 +70,8 @@ of :program:`Wien2k`, you have to use::
 This reads the files :file:`material_of_interest.parproj` and :file:`material_of_interest.sympar`.
 Again, there are two optional parameters

-* `par_proj_subgrp`: The subgroup for partial projectors data. The default value is `SumK_LDA_ParProj`.
-* `symm_par_subgrp`: The subgroup for symmetry operations data. The default value is `SymmPar`.
+* `parproj_subgrp`: The subgroup for partial projectors data. The default value is `lda_parproj_input`.
+* `symmpar_subgrp`: The subgroup for symmetry operations data. The default value is `lda_symmpar_input`.

 Another routine of the class allows to read the input for plotting the momentum-resolved
 spectral function. It is done by::

@@ -79,7 +79,7 @@ spectral function. It is done by::
 Converter.convert_bands_input()

 The optional parameter that controls where the data is stored is `bands_subgrp`,
-with the default value `SumK_LDA_Bands`.
+with the default value `lda_bands_input`.

 After having converted this input, you can further proceed with the :ref:`analysis`.

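
A hedged continuation of the sketch above, showing the remaining conversion steps; in the updated converter the target subgroups are fixed in the constructor, so these calls take no subgroup arguments:

    Converter.convert_parproj_input()   # reads case.parproj / case.sympar -> 'lda_parproj_input', 'lda_symmpar_input'
    Converter.convert_bands_input()     # reads case.outband -> 'lda_bands_input'
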
@@ -11,7 +11,7 @@ hdf5 data format

 In order to be used with the DMFT routines, the following data needs to be provided in the hdf5 file. It contains a lot of information in order to perform DMFT calculations for all kinds of situations, e.g. d-p Hamiltonians, more than one correlated atomic shell, or using symmetry operations for the k-summation. We store all data in subgroups of the hdf5 arxive:

-:program:`Main data`: There needs to be one subgroup for the main data of the calculation. The default name of this group is `SumK_LDA`. Its contents are
+:program:`Main data`: There needs to be one subgroup for the main data of the calculation. The default name of this group is `lda_input`. Its contents are

 * `energy_unit`, numpy.float. The unit of energy used for the calculation

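
For illustration, a hedged sketch of inspecting the renamed main data group of a converted archive; the import path and archive name are assumptions, while the group and item names come from this diff:

    from pytriqs.archive import HDFArchive   # assumed import path

    ar = HDFArchive('material_of_interest.h5', 'r')
    lda = ar['lda_input']                    # the renamed main-data subgroup
    print lda['energy_unit'], lda['n_k']     # e.g. the energy unit and the number of k-points
    del ar
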
@@ -9,6 +9,32 @@ Changed the following:
 * Gupf -> G_upfold
 * read_symmetry_input -> convert_symmetry_input

+**********
+* changed default h5 subgroup names
+
+SumK_LDA -> dft_input
+dft_band_input
+SymmCorr -> dft_symmcorr_input
+
+SumK_LDA_ParProj -> dft_parproj_input
+SymmPar -> dft_symmpar_input
+
+def __init__(self, filename, lda_subgrp = 'SumK_LDA', symm_subgrp = 'SymmCorr', repacking = False):
+-->
+def __init__(self, filename, lda_subgrp = 'dft_input', symm_subgrp = 'dft_symm_input', repacking = False):
+
+declare all groupnames in init
+
+symm_subgrp -> symmcorr_subgrp
+symm_par_subgrp -> symmpar_subgrp
+par_proj_subgrp -> parproj_subgrp
+
+symm_data -> symmcorr_data
+par_proj_data -> parproj_data
+symm_par_data -> symmpar_data
+
+**********
+
 * moved find_dc, find_mu_nonint, check_projectors, sorts_of_atoms,
 number_of_atoms to end, not to be documented.
 * replaced all instances of
@@ -43,7 +43,7 @@ class HkConverter:
 Conversion from general H(k) file to an hdf5 file that can be used as input for the SumK_LDA class.
 """

-def __init__(self, hk_file, hdf_file, lda_subgrp = 'SumK_LDA', symm_subgrp = 'SymmCorr', repacking = False):
+def __init__(self, hk_file, hdf_file, lda_subgrp = 'lda_input', symmcorr_subgrp = 'lda_symmcorr_input', repacking = False):
 """
 Init of the class.
 on.

@@ -53,20 +53,18 @@ class HkConverter:
 self.hdf_file = hdf_file
 self.lda_file = hk_file
 self.lda_subgrp = lda_subgrp
-self.symm_subgrp = symm_subgrp
+self.symmcorr_subgrp = symmcorr_subgrp

 # Checks if h5 file is there and repacks it if wanted:
 import os.path
 if (os.path.exists(self.hdf_file) and repacking):
 self.__repack()

-
-
 def convert_dmft_input(self, first_real_part_matrix = True, only_upper_triangle = False, weights_in_file = False):
 """
 Reads the input files, and stores the data in the HDFfile
 """

 # Read and write only on the master node
 if not (mpi.is_master_node()): return
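
For illustration, a hedged sketch of using HkConverter with the renamed defaults; the import path and file names are assumptions, not part of this diff:

    from pytriqs.applications.dft.converters.hk_converter import HkConverter  # assumed import path

    Converter = HkConverter(hk_file = 'material.hk',     # general H(k) input file
                            hdf_file = 'material.h5')    # target archive; main data goes to 'lda_input'
    Converter.convert_dmft_input()
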
@@ -121,10 +119,8 @@ class HkConverter:
 [0.0, 1.0/sqrt(2.0), 0.0, -1.0/sqrt(2.0), 0.0],
 [0.0, 1.0/sqrt(2.0), 0.0, 1.0/sqrt(2.0), 0.0]])

-
 # Spin blocks to be read:
 n_spin_blocs = SP + 1 - SO # number of spins to read for Norbs and Ham, NOT Projectors

-
 # define the number of n_orbitals for all k points: it is the number of total bands and independent of k!
 n_orb = sum([ shells[ish][3] for ish in range(n_shells) ])

@@ -133,7 +129,6 @@ class HkConverter:
 # Initialise the projectors:
 proj_mat = numpy.zeros([n_k,n_spin_blocs,n_corr_shells,max(numpy.array(corr_shells)[:,3]),max(n_orbitals)],numpy.complex_)

-
 # Read the projectors from the file:
 for ik in xrange(n_k):
 for icrsh in range(n_corr_shells):

@@ -151,8 +146,6 @@ class HkConverter:

 proj_mat[ik,isp,icrsh,0:no,offset:offset+no] = numpy.identity(no)

-
-
 # now define the arrays for weights and hopping ...
 bz_weights = numpy.ones([n_k],numpy.float_)/ float(n_k) # w(k_index), default normalisation
 hopping = numpy.zeros([n_k,n_spin_blocs,max(n_orbitals),max(n_orbitals)],numpy.complex_)

@@ -208,13 +201,10 @@ class HkConverter:
 raise "HK Converter : reading file lda_file failed!"

 R.close()

-#-----------------------------------------
-# Store the input into HDF5:
+# Save to the HDF5:
 ar = HDFArchive(self.hdf_file,'a')
 if not (self.lda_subgrp in ar): ar.create_group(self.lda_subgrp)
-# The subgroup containing the data. If it does not exist, it is created.
-# If it exists, the data is overwritten!!!
 things_to_save = ['energy_unit','n_k','k_dep_projection','SP','SO','charge_below','density_required',
 'symm_op','n_shells','shells','n_corr_shells','corr_shells','use_rotations','rot_mat',
 'rot_mat_time_inv','n_reps','dim_reps','T','n_orbitals','proj_mat','bz_weights','hopping']
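
For illustration, a hedged sketch of the saving idiom used above, in isolation: every named value is written into the chosen subgroup of the archive. HDFArchive, create_group and the item-assignment pattern come from this diff; the helper name, the toy values and the import path are assumptions:

    from pytriqs.archive import HDFArchive   # assumed import path

    def save_to_subgroup(hdf_file, subgrp, **data):
        ar = HDFArchive(hdf_file, 'a')
        if not (subgrp in ar): ar.create_group(subgrp)   # created if missing, overwritten otherwise
        for name, value in data.items():
            ar[subgrp][name] = value
        del ar                                           # closes the archive

    save_to_subgroup('material.h5', 'lda_input', energy_unit = 1.0, n_k = 100)
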
@@ -42,20 +42,25 @@ class Wien2kConverter:
 Conversion from Wien2k output to an hdf5 file that can be used as input for the SumkLDA class.
 """

-def __init__(self, filename, lda_subgrp = 'SumK_LDA', symm_subgrp = 'SymmCorr', repacking = False):
+def __init__(self, filename, lda_subgrp = 'lda_input', symmcorr_subgrp = 'lda_symmcorr_input',
+             parproj_subgrp='lda_parproj_input', symmpar_subgrp='lda_symmpar_input',
+             bands_subgrp = 'lda_bands_input', repacking = False):
 """
 Init of the class. Variable filename gives the root of all filenames, e.g. case.ctqmcout, case.h5, and so on.
 """

-assert type(filename)==StringType,"LDA_file must be a filename"
+assert type(filename)==StringType, "Please provide the LDA files' base name as a string."
 self.hdf_file = filename+'.h5'
 self.lda_file = filename+'.ctqmcout'
-self.symm_file = filename+'.symqmc'
+self.symmcorr_file = filename+'.symqmc'
 self.parproj_file = filename+'.parproj'
 self.symmpar_file = filename+'.sympar'
 self.band_file = filename+'.outband'
 self.lda_subgrp = lda_subgrp
-self.symm_subgrp = symm_subgrp
+self.symmcorr_subgrp = symmcorr_subgrp
+self.parproj_subgrp = parproj_subgrp
+self.symmpar_subgrp = symmpar_subgrp
+self.bands_subgrp = bands_subgrp

 # Checks if h5 file is there and repacks it if wanted:
 import os.path
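
Quick reference, derived only from the attribute assignments above: with filename = 'case', the converter reads the following Wien2k outputs and writes to 'case.h5'. A sketch for orientation, not part of the diff itself:

    files = {'lda_file':      'case.ctqmcout',
             'symmcorr_file': 'case.symqmc',
             'parproj_file':  'case.parproj',
             'symmpar_file':  'case.sympar',
             'band_file':     'case.outband',
             'hdf_file':      'case.h5'}
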
@@ -69,7 +74,6 @@ class Wien2kConverter:
 Reads the input files, and stores the data in the HDFfile
 """

-
 # Read and write only on the master node
 if not (mpi.is_master_node()): return
 mpi.report("Reading input from %s..."%self.lda_file)

@@ -96,7 +100,7 @@ class Wien2kConverter:
 # now read the information about the shells:
 corr_shells = [ [ int(R.next()) for i in range(6) ] for icrsh in range(n_corr_shells) ] # reads iatom, sort, l, dim, SO flag, irep

-self.inequiv_shells(corr_shells) # determine the number of inequivalent correlated shells, has to be known for further reading...
+self.inequiv_shells(corr_shells) # determine the number of inequivalent correlated shells, needed for further reading

 use_rotations = 1
 rot_mat = [numpy.identity(corr_shells[icrsh][3],numpy.complex_) for icrsh in xrange(n_corr_shells)]

@@ -115,8 +119,6 @@ class Wien2kConverter:
 if (SP==1): # read time inversion flag:
 rot_mat_time_inv[icrsh] = int(R.next())

-
-
 # Read here the info for the transformation of the basis:
 n_reps = [1 for i in range(self.n_inequiv_corr_shells)]
 dim_reps = [0 for i in range(self.n_inequiv_corr_shells)]

@@ -138,23 +140,19 @@ class Wien2kConverter:
 for i in xrange(lmax):
 for j in xrange(lmax):
 T[icrsh][i,j] += 1j * R.next()

-
 # Spin blocks to be read:
 n_spin_blocs = SP + 1 - SO

-
 # read the list of n_orbitals for all k points
 n_orbitals = numpy.zeros([n_k,n_spin_blocs],numpy.int)
 for isp in range(n_spin_blocs):
 for ik in xrange(n_k):
 n_orbitals[ik,isp] = int(R.next())

-
 # Initialise the projectors:
 proj_mat = numpy.zeros([n_k,n_spin_blocs,n_corr_shells,max(numpy.array(corr_shells)[:,3]),max(n_orbitals)],numpy.complex_)

-
 # Read the projectors from the file:
 for ik in xrange(n_k):
 for icrsh in range(n_corr_shells):

@@ -169,7 +167,6 @@ class Wien2kConverter:
 for i in xrange(no):
 for j in xrange(n_orbitals[ik][isp]):
 proj_mat[ik,isp,icrsh,i,j] += 1j * R.next()

-
 # now define the arrays for weights and hopping ...
 bz_weights = numpy.ones([n_k],numpy.float_)/ float(n_k) # w(k_index), default normalisation

@@ -183,7 +180,7 @@ class Wien2kConverter:
 bz_weights[:] /= sm

 # Grab the H
-# we use now the convention of a DIAGONAL Hamiltonian!!!!
+# we use now the convention of a DIAGONAL Hamiltonian -- convention for Wien2K.
 for isp in range(n_spin_blocs):
 for ik in xrange(n_k) :
 no = n_orbitals[ik,isp]
@@ -197,36 +194,29 @@ class Wien2kConverter:
 raise "Wien2k_converter : reading file lda_file failed!"

 R.close()
+# Reading done!

-#-----------------------------------------
-# Store the input into HDF5:
+# Save it to the HDF:
 ar = HDFArchive(self.hdf_file,'a')
 if not (self.lda_subgrp in ar): ar.create_group(self.lda_subgrp)
-# The subgroup containing the data. If it does not exist, it is created.
-# If it exists, the data is overwritten!!!
+# The subgroup containing the data. If it does not exist, it is created. If it exists, the data is overwritten!
 things_to_save = ['energy_unit','n_k','k_dep_projection','SP','SO','charge_below','density_required',
 'symm_op','n_shells','shells','n_corr_shells','corr_shells','use_rotations','rot_mat',
 'rot_mat_time_inv','n_reps','dim_reps','T','n_orbitals','proj_mat','bz_weights','hopping']
 for it in things_to_save: ar[self.lda_subgrp][it] = locals()[it]
 del ar

-# Symmetries are used,
-# Now do the symmetries for correlated orbitals:
-self.convert_symmetry_input(orbits=corr_shells,symm_file=self.symm_file,symm_subgrp=self.symm_subgrp,SO=SO,SP=SP)
+# Symmetries are used, so now convert symmetry information for *correlated* orbitals:
+self.convert_symmetry_input(orbits=corr_shells,symm_file=self.symmcorr_file,symm_subgrp=self.symmcorr_subgrp,SO=self.SO,SP=self.SP)

-def convert_parproj_input(self, par_proj_subgrp='SumK_LDA_ParProj', symm_par_subgrp='SymmPar'):
+def convert_parproj_input(self):
 """
-Reads the input for the partial charges projectors from case.parproj, and stores it in the symm_par_subgrp
+Reads the input for the partial charges projectors from case.parproj, and stores it in the symmpar_subgrp
 group in the HDF5.
 """

 if not (mpi.is_master_node()): return

-self.par_proj_subgrp = par_proj_subgrp
-self.symm_par_subgrp = symm_par_subgrp

 mpi.report("Reading parproj input from %s..."%self.parproj_file)

 dens_mat_below = [ [numpy.zeros([self.shells[ish][3],self.shells[ish][3]],numpy.complex_) for ish in range(self.n_shells)]

@@ -247,8 +237,8 @@ class Wien2kConverter:
 # read first the projectors for this orbital:
 for ik in xrange(self.n_k):
 for ir in range(n_parproj[ish]):

 for isp in range(self.n_spin_blocs):

 for i in xrange(self.shells[ish][3]): # read real part:
 for j in xrange(self.n_orbitals[ik][isp]):
 proj_mat_pc[ik,isp,ish,ir,i,j] = R.next()

@@ -278,37 +268,31 @@ class Wien2kConverter:
 for j in xrange(self.shells[ish][3]):
 rot_mat_all[ish][i,j] += 1j * R.next()

-#print Dens_Mat_below[0][ish],Dens_Mat_below[1][ish]

 if (self.SP):
 rot_mat_all_time_inv[ish] = int(R.next())

 R.close()
+# Reading done!

-#-----------------------------------------
-# Store the input into HDF5:
+# Save it to the HDF:
 ar = HDFArchive(self.hdf_file,'a')
-if not (self.par_proj_subgrp in ar): ar.create_group(self.par_proj_subgrp)
+if not (self.parproj_subgrp in ar): ar.create_group(self.parproj_subgrp)
-# The subgroup containing the data. If it does not exist, it is created.
-# If it exists, the data is overwritten!!!
+# The subgroup containing the data. If it does not exist, it is created. If it exists, the data is overwritten!
 things_to_save = ['dens_mat_below','n_parproj','proj_mat_pc','rot_mat_all','rot_mat_all_time_inv']
-for it in things_to_save: ar[self.par_proj_subgrp][it] = locals()[it]
+for it in things_to_save: ar[self.parproj_subgrp][it] = locals()[it]
 del ar

-# Symmetries are used,
-# Now do the symmetries for all orbitals:
-self.convert_symmetry_input(orbits=self.shells,symm_file=self.symmpar_file,symm_subgrp=self.symm_par_subgrp,SO=self.SO,SP=self.SP)
+# Symmetries are used, so now convert symmetry information for *all* orbitals:
+self.convert_symmetry_input(orbits=self.shells,symm_file=self.symmpar_file,symm_subgrp=self.symmpar_subgrp,SO=self.SO,SP=self.SP)

-def convert_bands_input(self, bands_subgrp = 'SumK_LDA_Bands'):
+def convert_bands_input(self):
 """
 Converts the input for momentum resolved spectral functions, and stores it in bands_subgrp in the
 HDF5.
 """

 if not (mpi.is_master_node()): return

-self.bands_subgrp = bands_subgrp
 mpi.report("Reading bands input from %s..."%self.band_file)

 R = read_fortran_file(self.band_file)
@@ -375,15 +359,12 @@ class Wien2kConverter:
 raise "Wien2k_converter : reading file band_file failed!"

 R.close()
-# reading done!
+# Reading done!

-#-----------------------------------------
-# Store the input into HDF5:
+# Save it to the HDF:
 ar = HDFArchive(self.hdf_file,'a')
 if not (self.bands_subgrp in ar): ar.create_group(self.bands_subgrp)
-# The subgroup containing the data. If it does not exist, it is created.
-# If it exists, the data is overwritten!!!
+# The subgroup containing the data. If it does not exist, it is created. If it exists, the data is overwritten!
 things_to_save = ['n_k','n_orbitals','proj_mat','hopping','n_parproj','proj_mat_pc']
 for it in things_to_save: ar[self.bands_subgrp][it] = locals()[it]
 del ar

@@ -396,7 +377,6 @@ class Wien2kConverter:
 """

 if not (mpi.is_master_node()): return

 mpi.report("Reading symmetry input from %s..."%symm_file)

 n_orbits = len(orbits)

@@ -443,6 +423,7 @@ class Wien2kConverter:
 raise "Wien2k_converter : reading file symm_file failed!"

 R.close()
+# Reading done!

 # Save it to the HDF:
 ar=HDFArchive(self.hdf_file,'a')

@@ -461,14 +442,13 @@ class Wien2kConverter:
 import subprocess

 if not (mpi.is_master_node()): return

 mpi.report("Repacking the file %s"%self.hdf_file)

-retcode = subprocess.call(["h5repack","-i%s"%self.hdf_file, "-otemphgfrt.h5"])
-if (retcode!=0):
+return_code = subprocess.call(["h5repack", "-i %s"%self.hdf_file, "-o temphgfrt.h5"])
+if (return_code != 0):
 mpi.report("h5repack failed!")
 else:
-subprocess.call(["mv","-f","temphgfrt.h5","%s"%self.hdf_file])
+subprocess.call(["mv", "-f", "temphgfrt.h5", "%s"%self.hdf_file])

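
For illustration, a hedged sketch of what the repacking step amounts to for a standalone archive: repack into a temporary file and move it back over the original. Positional file arguments to h5repack are used here for simplicity; the helper name and file names are assumptions:

    import os, subprocess

    def repack(hdf_file, tmp_file = 'temphgfrt.h5'):
        # h5repack rewrites the file, reclaiming space freed by deleted groups
        if subprocess.call(["h5repack", hdf_file, tmp_file]) != 0:
            print "h5repack failed!"
        else:
            os.rename(tmp_file, hdf_file)

    repack('material_of_interest.h5')
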
@@ -32,8 +32,8 @@ class SumkLDA:
 """This class provides a general SumK method for combining ab-initio code and pytriqs."""

-def __init__(self, hdf_file, mu = 0.0, h_field = 0.0, use_lda_blocks = False, lda_data = 'SumK_LDA', symm_corr_data = 'SymmCorr',
-             par_proj_data = 'SumK_LDA_ParProj', symm_par_data = 'SymmPar', bands_data = 'SumK_LDA_Bands'):
+def __init__(self, hdf_file, mu = 0.0, h_field = 0.0, use_lda_blocks = False, lda_data = 'lda_input', symmcorr_data = 'lda_symmcorr_input',
+             parproj_data = 'lda_parproj_input', symmpar_data = 'lda_symmpar_input', bands_data = 'lda_bands_input'):
 """
 Initialises the class from data previously stored into an HDF5
 """

@@ -43,10 +43,10 @@ class SumkLDA:
 else:
 self.hdf_file = hdf_file
 self.lda_data = lda_data
-self.par_proj_data = par_proj_data
+self.symmcorr_data = symmcorr_data
+self.parproj_data = parproj_data
+self.symmpar_data = symmpar_data
 self.bands_data = bands_data
-self.symm_par_data = symm_par_data
-self.symm_corr_data = symm_corr_data
 self.block_names = [ ['up','down'], ['ud'] ]
 self.n_spin_blocks_gf = [2,1]
 self.G_upfold = None

@@ -105,7 +105,7 @@ class SumkLDA:

 if self.symm_op:
 #mpi.report("Do the init for symm:")
-self.Symm_corr = Symmetry(hdf_file,subgroup=self.symm_corr_data)
+self.Symm_corr = Symmetry(hdf_file,subgroup=self.symmcorr_data)

 # Analyse the block structure and determine the smallest blocs, if desired
 if (use_lda_blocks): dm=self.analyse_BS()

@@ -44,13 +44,13 @@ class SumkLDATools(SumkLDA):
 """Extends the SumkLDA class with some tools for analysing the data."""

-def __init__(self, hdf_file, mu = 0.0, h_field = 0.0, use_lda_blocks = False, lda_data = 'SumK_LDA', symm_corr_data = 'SymmCorr',
-             par_proj_data = 'SumK_LDA_ParProj', symm_par_data = 'SymmPar', bands_data = 'SumK_LDA_Bands'):
+def __init__(self, hdf_file, mu = 0.0, h_field = 0.0, use_lda_blocks = False, lda_data = 'lda_input', symmcorr_data = 'lda_symmcorr_input',
+             parproj_data = 'lda_parproj_input', symmpar_data = 'lda_symmpar_input', bands_data = 'lda_bands_input'):

 self.G_upfold_refreq = None
-SumkLDA.__init__(self,hdf_file=hdf_file,mu=mu,h_field=h_field,use_lda_blocks=use_lda_blocks,lda_data=lda_data,
-symm_corr_data=symm_corr_data,par_proj_data=par_proj_data,symm_par_data=symm_par_data,
-bands_data=bands_data)
+SumkLDA.__init__(self, hdf_file=hdf_file, mu=mu, h_field=h_field, use_lda_blocks=use_lda_blocks,
+lda_data=lda_data, symmcorr_data=symmcorr_data, parproj_data=parproj_data,
+symmpar_data=symmpar_data, bands_data=bands_data)

 def downfold_pc(self,ik,ir,ish,sig,gf_to_downfold,gf_inp):

@@ -237,13 +237,13 @@ class SumkLDATools(SumkLDA):

-def read_par_proj_input_from_hdf(self):
+def read_parproj_input_from_hdf(self):
 """
 Reads the data for the partial projectors from the HDF file
 """

 things_to_read = ['dens_mat_below','n_parproj','proj_mat_pc','rot_mat_all','rot_mat_all_time_inv']
-read_value = self.read_input_from_hdf(subgrp=self.par_proj_data,things_to_read = things_to_read)
+read_value = self.read_input_from_hdf(subgrp=self.parproj_data,things_to_read = things_to_read)
 return read_value

@@ -254,10 +254,10 @@ class SumkLDATools(SumkLDA):
 assert hasattr(self,"Sigma_imp"), "Set Sigma First!!"

 #things_to_read = ['Dens_Mat_below','N_parproj','Proj_Mat_pc','rotmat_all']
-#read_value = self.read_input_from_HDF(SubGrp=self.par_proj_data, things_to_read=things_to_read)
-read_value = self.read_par_proj_input_from_hdf()
+#read_value = self.read_input_from_HDF(SubGrp=self.parproj_data, things_to_read=things_to_read)
+read_value = self.read_parproj_input_from_hdf()
 if not read_value: return read_value
-if self.symm_op: self.Symm_par = Symmetry(self.hdf_file,subgroup=self.symm_par_data)
+if self.symm_op: self.Symm_par = Symmetry(self.hdf_file,subgroup=self.symmpar_data)

 mu = self.chemical_potential

@@ -515,10 +515,10 @@ class SumkLDATools(SumkLDA):

 #things_to_read = ['Dens_Mat_below','N_parproj','Proj_Mat_pc','rotmat_all']
-#read_value = self.read_input_from_HDF(SubGrp=self.par_proj_data,things_to_read=things_to_read)
-read_value = self.read_par_proj_input_from_hdf()
+#read_value = self.read_input_from_HDF(SubGrp=self.parproj_data,things_to_read=things_to_read)
+read_value = self.read_parproj_input_from_hdf()
 if not read_value: return read_value
-if self.symm_op: self.Symm_par = Symmetry(self.hdf_file,subgroup=self.symm_par_data)
+if self.symm_op: self.Symm_par = Symmetry(self.hdf_file,subgroup=self.symmpar_data)

 # Density matrix in the window
 bln = self.block_names[self.SO]

python/update_archive.py (Normal file, 35 lines)
@@ -0,0 +1,35 @@
+import h5py
+import sys
+import numpy
+import subprocess
+
+if len(sys.argv) < 2:
+    print "Usage: python update_archive.py old_archive"
+    sys.exit()
+
+print """
+This script is an attempt to update your archive to TRIQS 1.2.
+Please keep a copy of your old archive as this script is
+** not guaranteed ** to work for your archive.
+If you encounter any problem please report it on github!
+"""
+
+filename = sys.argv[1]
+A = h5py.File(filename)
+
+old_to_new = {'SumK_LDA':'lda_input', 'SumK_LDA_ParProj':'lda_parproj_input',
+              'SymmCorr':'lda_symmcorr_input', 'SymmPar':'lda_symmpar_input', 'SumK_LDA_Bands':'lda_bands_input'}
+
+for old, new in old_to_new.iteritems():
+    if old not in A.keys(): continue
+    print "Changing %s to %s ..."%(old, new)
+    A.copy(old,new)
+    del(A[old])
+A.close()
+
+# Repack to reclaim disk space
+retcode = subprocess.call(["h5repack","-i%s"%filename, "-otemphgfrt.h5"])
+if retcode != 0:
+    print "h5repack failed!"
+else:
+    subprocess.call(["mv","-f","temphgfrt.h5","%s"%filename])
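
For illustration, a hedged sketch of running the script and checking the result; the archive name is an assumption:

    # Run:  python update_archive.py material_of_interest.h5
    # Afterwards the top-level groups should carry the new names:
    import h5py
    with h5py.File('material_of_interest.h5', 'r') as A:
        print A.keys()   # expect e.g. ['lda_input', 'lda_symmcorr_input', ...]
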
BIN test/SrVO3.h5 (Binary file not shown)