In this section, we describe ultrafast carrier dynamics calculations. In contrast to other PERTURBO calculation modes, here we solve the real-time Boltzmann transport equation (rt-BTE):

\[\frac{\partial f_{n\mathbf{k}}(t)}{\partial t} = \frac{e\mathbf{E}}{\hbar}\cdot \nabla _\mathbf{k}f_{n\mathbf{k}}(t)+\mathcal{I}^{e-\mathrm{ph}}[f_{n\mathbf{k}}(t)],\]

where \(f_{n\mathbf{k}}(t)\) is the time-dependent electronic occupation of a Bloch state with band index \(n\) and crystal momentum \(\mathbf{k}\), and \(e\) is the electronic charge. The first term on the right-hand side, called the advection term, is proportional to the external electric field \(\mathbf{E}\) and is responsible for electron drift. The second term is the \(e\)-ph collision integral \(\mathcal{I}^{e-\mathrm{ph}}\) accounting for phonon absorption and emission processes.
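
For reference, the collision integral has the standard schematic form (written here with the e-ph matrix elements \(g_{mn\nu}(\mathbf{k},\mathbf{q})\) and equilibrium phonon occupations \(N_{\nu\mathbf{q}}\); this is a sketch of the textbook expression, and the precise normalization and conventions used by PERTURBO are given in its reference publications):

\[\mathcal{I}^{e-\mathrm{ph}}[f_{n\mathbf{k}}] = -\frac{2\pi}{\hbar}\frac{1}{N_\mathbf{q}}\sum_{m\nu\mathbf{q}} |g_{mn\nu}(\mathbf{k},\mathbf{q})|^2 \Big\{ \delta(\epsilon_{n\mathbf{k}}-\epsilon_{m\mathbf{k+q}}+\hbar\omega_{\nu\mathbf{q}}) \big[ f_{n\mathbf{k}}(1-f_{m\mathbf{k+q}})N_{\nu\mathbf{q}} - f_{m\mathbf{k+q}}(1-f_{n\mathbf{k}})(N_{\nu\mathbf{q}}+1) \big] + \delta(\epsilon_{n\mathbf{k}}-\epsilon_{m\mathbf{k+q}}-\hbar\omega_{\nu\mathbf{q}}) \big[ f_{n\mathbf{k}}(1-f_{m\mathbf{k+q}})(N_{\nu\mathbf{q}}+1) - f_{m\mathbf{k+q}}(1-f_{n\mathbf{k}})N_{\nu\mathbf{q}} \big] \Big\},\]

where the first and second delta functions correspond to phonon absorption and emission by the electron in state \(n\mathbf{k}\), respectively.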

First, we consider zero-field ultrafast dynamics (\(\mathbf{E}=0\)); we then turn to high-field dynamics, where the full rt-BTE is solved.

Zero-field ultrafast dynamics
calc_mode = ‘dynamics-run’

Ultrafast dynamics calculations require a prior 'setup' calculation, which we assume the user has already performed. From the 'setup' calculation, we retain the following files needed for the dynamics calculations: ‘prefix’.temper and ‘prefix’_tet.h5.

In a typical zero-field dynamics simulation, at \(t=0\) one excites the electronic system, for example with a Gaussian pulse in energy, and then follows the electron relaxation due to electron-phonon scattering. The excited carrier concentration is conserved during the real-time simulation, and the integral of the excitation pulse corresponds to the excited carrier density.
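
For reference (schematic form only; the exact amplitude and width convention may differ in the code), the initial occupation generated with the Gaussian option described below is of the form

\[f_{n\mathbf{k}}(t=0) \propto \exp\!\left[-\frac{(\epsilon_{n\mathbf{k}}-E_0)^2}{\sigma^2}\right],\]

where \(E_0\) and \(\sigma\) are set by the boltz_init_e0 and boltz_init_smear parameters, respectively.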

For the 'dynamics-run' calculation, specify the following variables in the input file (pert.in):

  • prefix: the same prefix as in ‘prefix’_epr.h5
  • calc_mode: set to 'dynamics-run'
  • boltz_kdim: \(\mathbf{k}\) grid for electrons, here we use a 80x80x80 grid
  • boltz_qdim: \(\mathbf{q}\) grid for phonons, specify if it is different from the \(\mathbf{k}\) grid, here we use the same \(\mathbf{q}\) grid as \(\mathbf{k}\) grid
  • boltz_emin, boltz_emax: energy window (in eV units), use the same as in the 'setup' calculation (here, 6.4 and 7.4 eV)
  • band_min, band_max: band range
  • ftemper: the filename of a file containing temperature, chemical potential, and carrier concentration values (see the format)
  • time_step: simulation time step \(\Delta t\), here set to its typical value of 1 fs. Accuracy usually improves with shorter time steps, at the cost of longer calculation times.
  • boltz_nstep: total number of time steps, here we perform a relatively short simulation of 50 steps
  • output_nstep: an optional variable to reduce the amount of output; the output time step is determined as \(\Delta t_{out} = \texttt{output_nstep}\times \Delta t\) (see the short check after this list)
  • solver: rt-BTE solver type; here set to 'euler' to use the first-order Euler solver
  • boltz_init_dist: the initial carrier distribution, here set to 'gaussian' for a Gaussian distribution. To restart a simulation, set boltz_init_dist='restart'; the distribution from the last step of the previous simulation is then used as the initial distribution.
  • boltz_init_e0: energy used to generate the initial distribution. For the Gaussian distribution used in this example, it gives the center of the Gaussian and is set to 7.0 eV.
  • boltz_init_smear: broadening (width) of the initial distribution. For the Gaussian used here, it is set to 40 meV.
  • phfreq_cutoff: phonon energy cutoff; phonons with energies below this cutoff are excluded from the calculation, here set to 1 meV.
  • delta_smear: the broadening used to approximate the energy-conserving Dirac delta function, here chosen to be 8 meV.
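
As a quick check of how the time-step parameters determine the stored output (a minimal Python sketch; it assumes the number of stored output steps equals boltz_nstep divided by output_nstep, consistent with the example in this tutorial):

#!/usr/bin/env python3

# relate the time-step input parameters to the stored output
def output_layout(time_step_fs, boltz_nstep, output_nstep):
    dt_out = output_nstep * time_step_fs      # output time step (fs)
    num_steps = boltz_nstep // output_nstep   # output time steps after t0
    total_time = boltz_nstep * time_step_fs   # total simulated time (fs)
    return dt_out, num_steps, total_time

# for the input file below: output every 2 fs, 25 output steps, 50 fs in total
print(output_layout(1.0, 50, 2))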

In this example, we use the boltz_nstep parameter to fix the number of time steps (set to 50). Alternatively, the following parameters can be used to control when the simulation stops:

  • boltz_acc_thr: drift acceleration threshold in units of cm s\(^{-2}\); if specified, the real-time simulation stops once the drift acceleration has stayed below this threshold for at least the last 10 iterations
  • boltz_nstep_min: only applies if boltz_acc_thr is specified; the minimum number of time steps taken before the real-time simulation is allowed to stop (even if the drift acceleration threshold criterion has already been met). The stopping logic is sketched after this list.
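
The stopping logic described by these two parameters can be sketched as follows (illustrative Python pseudologic, not the actual PERTURBO implementation):

# illustrative sketch of the stopping criterion (not the PERTURBO source code):
# stop once the drift acceleration has stayed below boltz_acc_thr for the last
# 10 steps, but never before boltz_nstep_min steps have been taken
def should_stop(accel_history, step, boltz_acc_thr, boltz_nstep_min=0):
    if step < boltz_nstep_min:
        return False
    recent = accel_history[-10:]
    return len(recent) == 10 and all(a < boltz_acc_thr for a in recent)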

Here is the input file (pert.in):

&perturbo
 prefix      = 'si'
 calc_mode   = 'dynamics-run'

 boltz_kdim(1) = 80
 boltz_kdim(2) = 80
 boltz_kdim(3) = 80

 boltz_emin = 6.4
 boltz_emax = 7.4
 band_min = 5
 band_max = 6

 ftemper  = 'si.temper'

 time_step   = 1 !fs
 boltz_nstep = 50
 output_nstep = 2
 solver = 'euler'

 boltz_init_dist = 'gaussian'
 boltz_init_e0 = 7.0 ! eV
 boltz_init_smear = 40 !meV

 tmp_dir = "./tmp"
 phfreq_cutoff = 1 ! meV
 delta_smear = 8 ! meV
/

In this example, we calculate the evolution of the electron distribution. To perform hole dynamics instead, set the parameter hole to true.

Run perturbo.x (remember to link or copy ‘prefix’_epr.h5 in the current directory):

$ export OMP_NUM_THREADS=4
$ mpirun -n 8 <perturbo_bin>/perturbo.x -npools 8 -i pert.in > pert.out

We obtain the ‘prefix’_cdyna.h5 HDF5 output file (this file can also be found in the “References” directory). It contains all the output information about the performed simulation and is organized as follows:

  • band_structure_ryd: electronic band structure in Ry; each column corresponds to the band index \(n\) \((\texttt{band_min}\leq n \leq \texttt{band_max})\)
  • dynamics_run_[i]: an HDF5 group that contains information about the i-th simulation.
    If the simulation was restarted (boltz_init_dist='restart') one or more times, there will be several dynamics_run_[i] groups; otherwise, only dynamics_run_1 will be present. A dynamics_run_[i] group is structured as follows:
    • num_steps: the number of output time steps (taking into account output_nstep), can be different for different dynamics_run_[i]

    • snap_t_0: \(f_{n\mathbf{k}}(t_0)\)
      \(\vdots\)
    • snap_t_[j]: the distribution function \(f_{n\mathbf{k}}\) for time \(t_j\): \(t_j = t_0 + j\Delta t_{out}\), where \(\Delta t_{out}\) is the output time step. Each column of the array corresponds to the band index \(n\)
      \(\vdots\)
    • snap_t_[num_steps]: \(f_{n\mathbf{k}}(t_{\texttt{num_steps}})\)

    • time_step_fs: the output time step \(\Delta t_{out}\) (can be different for different dynamics_run_[i])
  • num_runs: total number of performed simulations (corresponds to the number of dynamics_run_[i] groups).


The ‘prefix’_cdyna.h5 file structure can be schematically represented as follows:

[diagram: schematic structure of the ‘prefix’_cdyna.h5 file (diagram_hdf5_dynamics-run)]
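
Based on this layout, the file can be quickly inspected with h5py (a minimal sketch, assuming num_runs, num_steps, and time_step_fs are stored as datasets as described above):

#!/usr/bin/env python3
import h5py

prefix = 'si'

# open the dynamics output file and list the stored runs
with h5py.File(prefix + '_cdyna.h5', 'r') as h5file:
    num_runs = h5file['num_runs'][()]
    print('number of runs:', num_runs)
    for i in range(1, num_runs + 1):
        run = h5file['dynamics_run_' + str(i)]
        print('dynamics_run_' + str(i) + ':',
              run['num_steps'][()], 'output steps,',
              'output time step', run['time_step_fs'][()], 'fs')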

We also obtain an output YAML file, here called si_dynamics-run.yml, containing the inputs and outputs of the 'dynamics-run' calculation. The sections that are common to all Perturbo calculations are described here.

An example of the fields unique to the 'dynamics-run' calculation is given below.

grid:
   number of  tetrahedra selected:     380160
   number of reducible k-points:      72165
   number of irreducible k-points:       1797


scatter parallelization:

   MPI task index:

      1:
         number of q-points:          254451
         number of (k,q) pairs:        48597938

      2:
         number of q-points:          254484
         number of (k,q) pairs:        48606145
     ......

dynamics-run:

Because most of the data is written to the HDF5 files, only a small amount of data is written to the si_dynamics-run.yml file. The grid field gives the number of tetrahedra, reducible \(\mathbf{k}\) points, and irreducible \(\mathbf{k}\) points in the energy window of interest; the specific \(\mathbf{k}\) points and tetrahedra are given in the ‘prefix’_tet.h5 file. The scatter parallelization field shows how the \(\mathbf{q}\) points and \((\mathbf{k},\mathbf{q})\) pairs are distributed across the MPI tasks. The dynamics-run field is empty, as the results are written to the ‘prefix’_cdyna.h5 HDF5 file.

We may next use Perturbopy to export the data to Python for postprocessing. For more details, see the Perturbopy website, which explains how Perturbopy processes the HDF5 files and can be used to quickly generate plots. Alternatively, users can process the HDF5 files with their own Python scripts using the h5py package. As an example, we present here a simple Python script that visualizes the distribution function at time \(t_5\) of the simulation for the first band in the selected band range.

#!/usr/bin/env python3
import h5py
import matplotlib.pyplot as plt

prefix='si'
snap_number=5
band_index=0

# load the HDF5 file
h5file = h5py.File(prefix+'_cdyna.h5', 'r')

# get the data
ryd2ev = h5file['band_structure_ryd'].attrs['ryd2ev']
energy_ev = h5file['band_structure_ryd'][:,band_index] * ryd2ev
dist_func = h5file['dynamics_run_1']['snap_t_'+str(snap_number)][:,band_index]
h5file.close()

# plot the data
plt.plot(energy_ev,dist_func,marker='o',linestyle='')
plt.xlabel('Energy (eV)')
plt.ylabel('Distribution function')
plt.show()

In order to postprocess this file using perturbo.x, see the dynamics-pp section.

We may also want to run the calculation including only certain phonon modes, to determine the effect of different modes. To do this, we can use the optional parameter ph_mode_exclude_ranges, which excludes phonon modes from the rt-BTE by setting the corresponding \(|g|^2\) factors to zero. For example:

  • to exclude modes 2 to 5, write in the input file: ph_mode_exclude_ranges(1,:)=2,5. The index 1 indicates that this is the first range to exclude.
  • to also exclude modes 7 to 10, add another line to the input file: ph_mode_exclude_ranges(2,:)=7, 10. The index 2 indicates that this is the second range to exclude.
  • to also exclude a single mode (say, mode 12), write: ph_mode_exclude_ranges(3,:)=12, 12. The index 3 indicates that this is the third range to exclude.

Note that the maximum number of ranges is 30, and the indices of the phonon modes to exclude must be smaller than the total number of modes in a system.
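
Taken together, the three example ranges above would appear in the pert.in input file as:

 ph_mode_exclude_ranges(1,:) = 2, 5
 ph_mode_exclude_ranges(2,:) = 7, 10
 ph_mode_exclude_ranges(3,:) = 12, 12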

Dynamics post-processing
calc_mode = ‘dynamics-pp’

In this section we aim to calculate the Brillouin zone-averaged energy-dependent carrier population \(\bar{f}(E,t)\). Having calculated the distribution function \(f_{n\mathbf{k}}(t)\), one can find \(\bar{f}(E,t)\) in the following way:

\[\bar{f}(E,t) = \sum_{n\mathbf{k}} f_{n\mathbf{k}}(t) \delta(\epsilon_{n\mathbf{k}}-E).\]

The integral of \(\bar{f}(E,t)\) over the energy gives the number of carriers per unit cell as a function of time.

To calculate \(\bar{f}(E,t)\), one needs all the files required for the calc_mode='dynamics-run' calculation (previous section) as well as the HDF5 output file ‘prefix’_cdyna.h5 from the dynamics-run calculation. To perform the postprocessing, use an input file similar to the one from the previous section, but change the calculation mode to calc_mode='dynamics-pp'. Run perturbo.x (remember to link or copy ‘prefix’_epr.h5 into the current directory):

$ mpirun -n 1 <perturbo_bin>/perturbo.x -npools 1 -i pert.in > pert.out

As output, we obtain the following files:

  • si_popu.h5: an HDF5 file that contains all the necessary information for \(\bar{f}(E,t)\)
  • si_cdyna.dat: an ASCII file containing the number of carriers per unit cell as a function of time

The si_popu.h5 HDF5 file is organized as follows:

  • energy_distribution: a group that contains the populations for all the time instants of the dynamics-run simulation

    • popu_t1: \(\bar{f}(E,t_1)\)
      \(\vdots\)
    • popu_t[j]: the carrier population \(\bar{f}(E,t_j)\) at time \(t_j\)
      \(\vdots\)
    • popu_t[num_steps+1]: \(\bar{f}(E,t_{\texttt{num_steps+1}})\)

  • energy_grid_ev: the grid of energies in eV; the number of energy grid points is given by \(\frac{ \texttt{emax} - \texttt{emin} }{ \texttt{boltz_de} }+\texttt{3}\)
  • times_fs: the array of time instants in fs


The si_popu.h5 HDF5 file can be schematically represented as follows:

[diagram: schematic structure of the si_popu.h5 file (diagram_hdf5_dynamics-pp)]

We also obtain an output YAML file, here called si_dynamics-pp.yml, containing the inputs and outputs of the 'dynamics-pp' calculation. The sections that are common to all Perturbo calculations are described here.

An example of the fields unique to the 'dynamics-pp' calculation is given below.

grid:
   number of  tetrahedra selected:     380160
   number of reducible k-points:      72165
   number of irreducible k-points:       1797


dynamics-pp:

   time units: fs

   concentration units: num. of carriers/u.c.

   number of snaps:      25

   time:
      -  0.0000000000000000E+00
      ......
      -  5.0000000000000000E+01

   concentration:
      -  1.1419980691444462E-03
      ......
      -  1.1419968066740211E-03

Again, the grid field gives the number of tetrahedra, reducible \(\mathbf{k}\) points, and irreducible \(\mathbf{k}\) points in the energy window of interest; the specific \(\mathbf{k}\) points and tetrahedra are given in the ‘prefix’_tet.h5 file. The dynamics-pp field gives, under the time and concentration fields respectively, the time in fs and the carrier concentration per unit cell (the integral of \(\bar{f}(E,t)\) over energy) at each output time step (here, there are 25).

We may next use Perturbopy to export the data to Python for postprocessing. For more details, see the Perturbopy website, which explains how Perturbopy processes the output and can be used to quickly generate plots. Alternatively, users can process the HDF5 files with their own scripts (e.g. in Python, Julia, or another high-level language). As an example, we provide here a simple Python script showing how to manipulate this HDF5 file. To plot the electron population at time \(t_{25}\), run:

#!/usr/bin/env python3
import h5py
import matplotlib.pyplot as plt

prefix='si'
snap_number=25

# load the HDF5 file
h5file = h5py.File(prefix+'_popu.h5', 'r')

# get the data
energy_ev = h5file['energy_grid_ev'][()]
population = h5file['energy_distribution']['popu_t'+str(snap_number)][()]
h5file.close()

# plot the data
plt.plot(energy_ev,population,marker='o',linestyle='')
plt.xlabel('Energy (eV)')
plt.ylabel('Electron population')
plt.show()
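
Since the integral of \(\bar{f}(E,t)\) over energy gives the number of carriers per unit cell, one can also cross-check the concentration reported in si_cdyna.dat and in the yml file directly from si_popu.h5. Here is a minimal sketch (it assumes the stored populations are per unit energy on the uniform energy grid, so that a simple Riemann sum applies; a constant prefactor may otherwise be needed):

#!/usr/bin/env python3
import h5py
import numpy as np

prefix = 'si'
snap_number = 25

# read the energy grid and the population at the chosen snapshot
with h5py.File(prefix + '_popu.h5', 'r') as h5file:
    energy_ev = h5file['energy_grid_ev'][()]
    population = h5file['energy_distribution']['popu_t' + str(snap_number)][()]

# approximate the energy integral of the population on the uniform grid
de = energy_ev[1] - energy_ev[0]
carriers_per_cell = np.sum(population) * de
print('carriers per unit cell at snapshot', snap_number, ':', carriers_per_cell)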

It is also convenient to postprocess and visualize the data in the HDF5 file using other high-level languages, such as Julia. For example, the following Julia script does the same as the Python script above:

using HDF5, Plots

prefix = "si"
fname = prefix * "_popu.h5"
snap_number = 25

# read the data
energy_ev = h5read(fname, "energy_grid_ev")
population = h5read(fname, "energy_distribution/popu_t"*string(snap_number))

# plot
plot(energy_ev, population, xlabel="Energy (eV)", ylabel="Electron population")

High-field ultrafast dynamics
calc_mode = ‘dynamics-run’

The steps for running high-field ultrafast dynamics are similar to those for zero-field ultrafast dynamics, with the following modifications to the input script.

  • boltz_efield: specifies the external electric field in V cm\(^{-1}\). This parameter has three components; here we set only the x component of the field and leave the other two components at zero.
  • boltz_norm_dist: normalizes the distribution function at each time step. This is not necessary at zero electric field, but is set to true here for the high-field calculation.

Here is the input file (pert.in):

&perturbo
 prefix      = 'si'
 calc_mode   = 'dynamics-run'

 boltz_kdim(1) = 80
 boltz_kdim(2) = 80
 boltz_kdim(3) = 80

 boltz_efield(1)      = 100.0
 boltz_efield(2)      = 0.0
 boltz_efield(3)      = 0.0

 boltz_emin = 6.4
 boltz_emax = 7.4
 band_min = 5
 band_max = 6

 ftemper  = 'si.temper'

 time_step   = 1 !fs
 boltz_nstep = 50
 output_nstep = 2
 solver = 'euler'

 boltz_init_dist = 'gaussian'
 boltz_init_e0 = 7.0 ! eV
 boltz_init_smear = 40 !meV

 tmp_dir = "./tmp"
 phfreq_cutoff = 1 ! meV
 delta_smear = 8 ! meV
/

We obtain the ‘prefix’_cdyna.h5 HDF5 output file, which is organized almost the same as the file shown above. However, it has the following additional field under the dynamics_run_[i] group:

  • dynamics_run_[i]
    • efield: the x, y, and z components of the external electric field in V cm\(^{-1}\)

We also obtain the ‘prefix’_dynamics-run.yml output file, which is identical to the example above.
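
To check which field was applied in a given run, the efield entry can be read directly from the HDF5 file (a minimal h5py sketch, assuming efield is stored as a dataset under the dynamics_run_[i] group as described above):

#!/usr/bin/env python3
import h5py

prefix = 'si'

# read back the electric field applied in the first run
with h5py.File(prefix + '_cdyna.h5', 'r') as h5file:
    efield = h5file['dynamics_run_1']['efield'][()]

print('applied electric field (V/cm):', efield)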

We may next use Perturbopy to export the data to Python for postprocessing. For more details, see the Perturbopy website.

High-field ultrafast dynamics
calc_mode = ‘dynamics-pp’

The steps for running high-field ultrafast dynamics post-processing are identical to those for zero-field ultrafast dynamics. Please refer to the section above for details.

After running the calculation, we obtain a ‘prefix’_popu.h5 file similar to the one described above. The ‘prefix’_dynamics-pp.yml file is also analogous to the one described above, but has additional fields that specify the electric field and the drift velocity.

dynamics-pp:

   time units: fs

   concentration units: num. of carriers/u.c.

   number of snaps:      25

   electric field: [ 100.00000,     0.00000,     0.00000,  ]

   velocity units: cm/s

   time:
      -  0.0000000000000000E+00
      ......
      -  5.0000000000000000E+01

   concentration:
      -  1.1419980691444460E-03
      ......
      -  1.1419975237966593E-03

   velocity:
      -  3.4859250185988308E-03
      ......
      - -1.5315116904290144E+04

As in the zero-field case, the dynamics-pp field gives, under the time and concentration fields respectively, the time in fs and the carrier concentration per unit cell at each time step. There is now an additional velocity field, which gives the drift velocity, in cm s\(^{-1}\), at each time. The electric field is also reported; note, however, that it simply echoes the boltz_efield value from the 'dynamics-pp' input file and may not be meaningful if that value differs from the one used in the 'dynamics-run' input file.

In addition, it is useful to note that the drift velocity is not printed if the electric field is set to zero. The drift velocity may also not be exactly zero in situations where it should vanish; this is due to finite numerical precision and can be disregarded.
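
For a quick look at the drift velocity versus time without Perturbopy, the yml file can also be read directly, for example with PyYAML (a sketch that assumes the time and velocity lists appear under the dynamics-pp key exactly as in the excerpt above):

#!/usr/bin/env python3
import yaml
import matplotlib.pyplot as plt

# read the dynamics-pp section of the yml file
with open('si_dynamics-pp.yml', 'r') as f:
    data = yaml.safe_load(f)

times = data['dynamics-pp']['time']           # fs
velocities = data['dynamics-pp']['velocity']  # cm/s

plt.plot(times, velocities, marker='o')
plt.xlabel('Time (fs)')
plt.ylabel('Drift velocity (cm/s)')
plt.show()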

We may next use Perturbopy to export the data to Python for postprocessing. For more details, see the Perturbopy website.

For more information, see the hands-on 4 from the 2023 Perturbo workshop.