File: credits.rst

.. Amp documentation master file, created by
   sphinx-quickstart on Thu Jul 30 17:27:50 2015.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Credits
=======

People
------

This project is developed primarily by **Andrew Peterson** and **Alireza Khorshidi** in the Brown University School of Engineering. Specific credits:

* Andrew Peterson: lead, PI, many modules
* Alireza Khorshidi: many modules, Zernike descriptor
* Zack Ulissi: TensorFlow version of the neural network
* Muammar El Khatib: general contributions

We are also indebted to Nongnuch Artrith (MIT) and Pedro Felzenszwalb (Brown) for inspiration and technical discussion.

Citations
---------

We would appreciate it if you cite the following publication for any use of Amp or its methods:

    Khorshidi & Peterson, "Amp: A modular approach to machine learning in atomistic simulations", *Computer Physics Communications* 207:310-324, 2016. |amp_paper|


.. |amp_paper| raw:: html

   <a href="http://dx.doi.org/10.1016/j.cpc.2016.05.010" target="_blank">[doi:10.1016/j.cpc.2016.05.010]</a>


If you use Amp for saddle-point searches or nudged elastic bands, please also cite:

    Peterson, "Acceleration of saddle-point searches with machine learning", *Journal of Chemical Physics*, 145:074106, 2016. |mlneb_paper|


.. |mlneb_paper| raw:: html

   <a href="http://dx.doi.org/10.1063/1.4960708" target="_blank">[doi:10.1063/1.4960708]</a>