[med-svn] [pycorrfit] 01/03: Imported Upstream version 0.8.3

Alex Mestiashvili malex-guest at moszumanska.debian.org
Tue Apr 15 12:41:45 UTC 2014


This is an automated email from the git hooks/post-receive script.

malex-guest pushed a commit to branch master
in repository pycorrfit.

commit cc4ac82249d34822bdca0099724103b9f6889965
Author: Alexandre Mestiashvili <alex at biotec.tu-dresden.de>
Date:   Tue Apr 15 10:36:33 2014 +0200

    Imported Upstream version 0.8.3
---
 ChangeLog.txt                                      |  18 +
 PyCorrFit_doc.pdf                                  | Bin 568711 -> 581216 bytes
 README.md                                          |   4 +-
 Readme.txt                                         |  31 +-
 doc-src/Bibliography.bib                           |  97 +++++-
 doc-src/Images/PyCorrFit_Screenshot_CSFCS.png      | Bin 123086 -> 121034 bytes
 doc-src/Images/PyCorrFit_Screenshot_Main.png       | Bin 89638 -> 90374 bytes
 doc-src/PyCorrFit_doc_content.tex                  |  69 ++--
 doc-src/PyCorrFit_doc_models.tex                   |  10 +-
 doc-src/README.md                                  |  23 ++
 ...C.fcsfit-session.zip => CSFCS_DiO-in-DOPC.pcfs} | Bin 387189 -> 386691 bytes
 ...session.zip => ConfocalFCS_Alexa488_xcorr.pcfs} | Bin 71711 -> 69822 bytes
 src/PyCorrFit.py                                   |  43 ++-
 src/__init__.py                                    |   2 +-
 src/doc.py                                         |  47 ++-
 src/edclasses.py                                   |  24 +-
 src/{leastsquaresfit.py => fitting.py}             | 167 +++++++---
 src/frontend.py                                    | 166 +++++++---
 src/models/MODEL_TIRF_gaussian_1C.py               | 167 +++++++++-
 src/models/MODEL_TIRF_gaussian_3D2D.py             |   9 +-
 src/models/MODEL_TIRF_gaussian_3D3D.py             |  11 +-
 src/models/MODEL_classic_gaussian_2D.py            |  14 +-
 src/models/MODEL_classic_gaussian_3D.py            |  22 +-
 src/models/MODEL_classic_gaussian_3D2D.py          |  19 +-
 src/models/__init__.py                             |  32 +-
 src/openfile.py                                    |  23 +-
 src/page.py                                        | 133 +++++---
 src/plotting.py                                    | 100 ++++--
 src/readfiles/__init__.py                          |  28 +-
 src/readfiles/read_ASC_ALV.py                      | 368 +++++++++++++++++++++
 src/readfiles/read_ASC_ALV_6000.py                 | 177 ----------
 src/readfiles/read_FCS_Confocor3.py                |  41 ++-
 src/tools/__init__.py                              |  17 +
 src/tools/average.py                               |  36 +-
 src/tools/background.py                            |  11 +-
 src/tools/batchcontrol.py                          |  27 +-
 src/tools/datarange.py                             |   4 +-
 src/tools/example.py                               |   9 +-
 src/tools/globalfit.py                             |  11 +-
 src/tools/info.py                                  |  22 +-
 src/tools/overlaycurves.py                         |  35 +-
 src/tools/parmrange.py                             |  11 +-
 src/tools/plotexport.py                            |   9 +-
 src/tools/simulation.py                            |  12 +-
 src/tools/statistics.py                            | 107 +++---
 src/tools/trace.py                                 |  24 +-
 src/usermodel.py                                   |   3 +-
 47 files changed, 1577 insertions(+), 606 deletions(-)

diff --git a/ChangeLog.txt b/ChangeLog.txt
index abdd46d..1e1ed03 100644
--- a/ChangeLog.txt
+++ b/ChangeLog.txt
@@ -1,3 +1,21 @@
+0.8.3
+- New .pcfs (PyCorrFit Session) file format (#60)
+- Additional fitting algorithms: Nelder-Mead, BFGS, Powell, Polak-Ribiere (#71)
+- Improvements
+   - Massive speed-up when working with large data sets (#77)
+   - Plot export: legend position and displayed parameters (#54)
+   - Average tool: traces may now start at timepoints != 0
+   - Statistics tool: display on smaller screens
+   - ALV data files: updated parser to identify curve types and segment traces
+   - Zeiss ConfoCor3 data files: some files could not be opened due to dummy AC curves
+   - Models: default parameters were changed to prevent unstable fits
+   - Software: notification dialogs for missing modules or other software
+- Bugfixes
+   - User could accidentally clear a session (#65)
+   - wxPython plotting problem on Mac OS X (#64)
+   - Statistics view: some parameters were duplicated (#76)
+   - Caught zero-division warnings (models with triplet component)
+   - Corrected x-axis scaling of statistics view and trace view
 0.8.2
 - The documentation has been thoroughly reworked
 - The user is now warned if he does not have a TeX distribution installed
diff --git a/PyCorrFit_doc.pdf b/PyCorrFit_doc.pdf
index 194e7d3..6d3afe5 100644
Binary files a/PyCorrFit_doc.pdf and b/PyCorrFit_doc.pdf differ
diff --git a/README.md b/README.md
index cacf31a..1c314a2 100644
--- a/README.md
+++ b/README.md
@@ -18,6 +18,6 @@ information, visit the official homepage at http://pycorrfit.craban.de.
 
 - [Download the latest version](https://github.com/paulmueller/PyCorrFit/releases)  
 - [Documentation](https://github.com/paulmueller/PyCorrFit/raw/master/PyCorrFit_doc.pdf)
-- [Run PyCorrFit from source](https://github.com/paulmueller/PyCorrFit/wiki/Running-PyCorrFit-from-source)
-- [Write model functions](https://github.com/paulmueller/PyCorrFit/wiki/Writing-model-functions)
+- [Run PyCorrFit from source](https://github.com/paulmueller/PyCorrFit/wiki/Running-from-source)
+- [Write your own model functions](https://github.com/paulmueller/PyCorrFit/wiki/Writing-model-functions)
 - [Need help?](https://github.com/paulmueller/PyCorrFit/wiki/Creating-a-new-issue)
diff --git a/Readme.txt b/Readme.txt
index 81624f6..77ad18f 100644
--- a/Readme.txt
+++ b/Readme.txt
@@ -1,17 +1,26 @@
-Scientific tool for fitting correlation curves on a logarithmic plot.
+PyCorrFit can be used for fitting any data on a semi-log plot. The program focuses on 
+Fluorescence Correlation Spectroscopy (FCS) and comes with a couple of features that are 
+crucial for FCS data analysis:
 
-In current biomedical research, fluorescence correlation spectroscopy (FCS) is  applied
-to characterize molecular dynamic processes in vitro and in living cells.  Commercial
-FCS setups only permit data analysis that is limited to  a specific instrument by
-the use of in-house file formats or a  finite number of implemented correlation
-model functions. PyCorrFit is a general-purpose FCS evaluation software that,
-amongst other formats, supports the established Zeiss ConfoCor3 ~.fcs  file format.
-PyCorrFit comes with several built-in model functions, covering a wide range of
-applications in standard confocal FCS. In addition, it contains equations dealing
-with different excitation geometries like total internal reflection (TIR). For more
-information, visit the official homepage at http://pycorrfit.craban.de.
+- Averaging of curves
+- Background correction
+- Batch processing
+- Overlay tool to identify outliers
+- Fast simulation of model parameter behavior
+- Session management
+- User-defined model functions
+- High quality plot export using LaTeX (bitmap or vector graphics)
 
+For a full list of features and supported file formats, visit http://pycorrfit.craban.de.
+There are also precompiled binaries for various systems.
 
+This package provides the Python module `pycorrfit` and its graphical user interface. The 
+graphical user interface is written with wxPython. A how-to for installing the latest 
+version of PyCorrFit using pip can be found here:
+
+https://github.com/paulmueller/PyCorrFit/wiki/Installation_pip
+
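For reference, the installation command documented there and in the PDF documentation is:

    pip install pycorrfit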
+Further reading:
 - Latest downloads: https://github.com/paulmueller/PyCorrFit/releases   
 - Documentation: https://github.com/paulmueller/PyCorrFit/raw/master/PyCorrFit_doc.pdf   
 - Write model functions: https://github.com/paulmueller/PyCorrFit/wiki/Writing-model-functions   
diff --git a/doc-src/Bibliography.bib b/doc-src/Bibliography.bib
index 7da50fa..7dc3337 100755
--- a/doc-src/Bibliography.bib
+++ b/doc-src/Bibliography.bib
@@ -483,6 +483,17 @@
   Timestamp                = {2012.10.25}
 }
 
+ at Book{Nocedal2006,
+  Title                    = {Numerical Optimization},
+  Author                   = {Nocedal, Jorge and Wright, Stephen J.},
+  Publisher                = {Springer Berlin Heidelberg},
+  Year                     = {2006},
+
+  Doi                      = {10.1007/978-3-540-35447-5},
+  Owner                    = {paul},
+  Timestamp                = {2014.03.31}
+}
+
 @Article{Jin2004,
   Title                    = {Near-surface velocimetry using evanescent wave illumination},
   Author                   = {Jin, S. and Huang, P. and Park, J. and Yoo, J. Y. and Breuer, K. S.},
@@ -643,6 +654,24 @@
   Timestamp                = {2012.09.21}
 }
 
+ at Article{Levenberg1944,
+  Title                    = {A method for the solution of certain non-linear problems in least squares},
+  Author                   = {Levenberg, Kenneth},
+  Journal                  = {Quarterly of Applied Mathematics},
+  Year                     = {1944},
+  Number                   = {2},
+  Pages                    = {164--168},
+  Volume                   = {II},
+
+  __markedentry            = {[paul:6]},
+  Citeulike-article-id     = {10796881},
+  Keywords                 = {indefinite, nonconvex, optimization},
+  Owner                    = {paul},
+  Posted-at                = {2012-06-17 09:00:21},
+  Priority                 = {2},
+  Timestamp                = {2014.03.31}
+}
+
 @Article{Lieto2003a,
   Title                    = {Ligand-Receptor Kinetics Measured by Total Internal Reflection with Fluorescence Correlation Spectroscopy},
   Author                   = {Lieto, Alena M. and Cush, Randall C. and Thompson, Nancy L.},
@@ -698,8 +727,6 @@
   Year                     = {2014},
   Pages                    = {635--51},
   Volume                   = {1076},
-
-  __markedentry            = {[paul:6]},
   Abstract                 = {Scanning fluorescence correlation spectroscopy (SFCS) with a scan path perpendicular to the membrane plane was introduced to measure diffusion and interactions of fluorescent components in free-standing biomembranes. Using a confocal laser scanning microscope (CLSM), the open detection volume is repeatedly scanned through the membrane at a kHz frequency. The fluorescence photons emitted from the detection volume are continuously recorded and stored in a file [...]
   Doi                      = {10.1007/978-1-62703-649-8_29},
   Owner                    = {paul},
@@ -760,15 +787,27 @@
   Year                     = {1999},
   Pages                    = {1619--1631},
   Volume                   = {76},
-  
+
+  Abstract                 = {The resolution limit of fluorescence correlation spectroscopy for two-component solutions is investigated theoretically and experimentally. The autocorrelation function for two different particles in solution were computed, statistical noise was added, and the resulting curve was fitted with a least squares fit. These simulations show that the ability to distinguish between two different molecular species in solution depends strongly on the number of photons [...]
   Doi                      = {10.1016/S0006-3495(99)77321-2},
   Keywords                 = {Diffusion, Multiple components},
-  
-  Abstract                 = {The resolution limit of fluorescence correlation spectroscopy for two-component solutions is investigated theoretically and experimentally. The autocorrelation function for two different particles in solution were computed, statistical noise was added, and the resulting curve was fitted with a least squares fit. These simulations show that the ability to distinguish between two different molecular species in solution depends strongly on the number of photons [...]
   Owner                    = {TW},
   Timestamp                = {2014.01.27}
 }
 
+ at Article{Nelder1965,
+  Title                    = {A simplex method for function minimization},
+  Author                   = {Nelder, John A and Mead, Roger},
+  Journal                  = {The Computer Journal},
+  Year                     = {1965},
+  Number                   = {4},
+  Pages                    = {308--313},
+  Volume                   = {7},
+  Doi                      = {10.1093/comjnl/7.4.308},
+  Owner                    = {paul},
+  Timestamp                = {2014.03.31}
+}
+
 @Article{Nitsche2004,
   Title                    = {A Transient Diffusion Model Yields Unitary Gap Junctional Permeabilities from Images of Cell-to-Cell Fluorescent Dye Transfer Between Xenopus Oocytes},
   Author                   = {Johannes M. Nitsche and Hou-Chien Chang and Paul A. Weber and Bruce J. Nicholson},
@@ -900,6 +939,37 @@
   Keyword                  = {Chemistry}
 }
 
+ at Article{Powell1964,
+  Title                    = {An efficient method for finding the minimum of a function of several variables without calculating derivatives},
+  Author                   = {Powell, M. J. D.},
+  Journal                  = {The Computer Journal},
+  Year                     = {1964},
+
+  Month                    = {Feb},
+  Number                   = {2},
+  Pages                    = {155--162},
+  Volume                   = {7},
+  Doi                      = {10.1093/comjnl/7.2.155},
+  ISSN                     = {1460-2067},
+  Owner                    = {paul},
+  Publisher                = {Oxford University Press (OUP)},
+  Timestamp                = {2014.03.31}
+}
+
+ at Article{Press,
+  Title                    = {Numerical Recipes},
+  Author                   = {Press, William H. and Flannery, Brian P. and Teukolsky, Saul A. and Vetterling, William T.},
+  Journal                  = {Cambridge University Press},
+  Year                     = {2006},
+  Pages                    = {989},
+  Volume                   = {1},
+
+  __markedentry            = {[paul:]},
+  Owner                    = {paul},
+  Publisher                = {Cambridge Univ Press},
+  Timestamp                = {2014.03.31}
+}
+
 @Article{Qian1991,
   Title                    = {Analysis of confocal laser-microscope optics for 3-D fluorescence correlation spectroscopy},
   Author                   = {Hong Qian and Elliot L. Elson},
@@ -1009,8 +1079,6 @@
   Number                   = {5},
   Pages                    = {1915--24},
   Volume                   = {91},
-
-  __markedentry            = {[paul:6]},
   Abstract                 = {Here we discuss the application of scanning fluorescence correlation spectroscopy (SFCS) using continuous wave excitation to analyze membrane dynamics. The high count rate per molecule enables the study of very slow diffusion in model and cell membranes, as well as the application of two-foci fluorescence cross-correlation spectroscopy for parameter-free determination of diffusion constants. The combination with dual-color fluorescence cross-correlation spec [...]
   Doi                      = {10.1529/biophysj.106.082297},
   Keywords                 = {Biological Transport, Active/physiology Cell Membrane/*chemistry/*metabolism Diffusion Membrane Proteins/analysis/*chemistry/*metabolism Spectrometry, Fluorescence/*methods Time Factors},
@@ -1421,8 +1489,6 @@
   Number                   = {4},
   Pages                    = {878--90},
   Volume                   = {2},
-
-  __markedentry            = {[paul:6]},
   Abstract                 = {Total internal reflection-fluorescence correlation spectroscopy (TIR-FCS) is an emerging technique that is used to measure events at or near an interface, including local fluorophore concentrations, local translational mobilities and the kinetic rate constants that describe the association and dissociation of fluorophores at the interface. TIR-FCS is also an extremely promising method for studying dynamics at or near the basal membranes of living cells. This [...]
   Doi                      = {10.1038/nprot.2007.110},
   Keywords                 = {Fluorescent Dyes/analysis Kinetics Ligands Spectrometry, Fluorescence/instrumentation/*methods},
@@ -1638,6 +1704,19 @@
   Timestamp                = {2012.11.07}
 }
 
+ at Conference{Wright1996,
+  Title                    = {Direct Search Methods: Once Scorned, Now Respectable},
+  Author                   = {Wright, M.H.},
+  Booktitle                = {Numerical Analysis},
+  Year                     = {1996},
+  Editor                   = {D.F. Griffiths and G.A. Watson},
+  Pages                    = {191--208},
+  Publisher                = {Addison Wesley Longman, Harlow, UK},
+
+  Owner                    = {paul},
+  Timestamp                = {2014.03.31}
+}
+
 @Article{Yordanov2009,
   Title                    = {Direct studies of liquid flows near solid surfaces by total internal reflection fluorescence cross-correlation spectroscopy},
   Author                   = {Stoyan Yordanov and Andreas Best and Hans-J\"{u}rgen Butt and Kaloian Koynov},
diff --git a/doc-src/Images/PyCorrFit_Screenshot_CSFCS.png b/doc-src/Images/PyCorrFit_Screenshot_CSFCS.png
index b2915e1..521cb3b 100644
Binary files a/doc-src/Images/PyCorrFit_Screenshot_CSFCS.png and b/doc-src/Images/PyCorrFit_Screenshot_CSFCS.png differ
diff --git a/doc-src/Images/PyCorrFit_Screenshot_Main.png b/doc-src/Images/PyCorrFit_Screenshot_Main.png
index a985c7d..e2c7376 100644
Binary files a/doc-src/Images/PyCorrFit_Screenshot_Main.png and b/doc-src/Images/PyCorrFit_Screenshot_Main.png differ
diff --git a/doc-src/PyCorrFit_doc_content.tex b/doc-src/PyCorrFit_doc_content.tex
index 5f6a894..c0e6ae0 100755
--- a/doc-src/PyCorrFit_doc_content.tex
+++ b/doc-src/PyCorrFit_doc_content.tex
@@ -57,21 +57,20 @@ PyCorrFit is available from the Debian repositories and can be installed via the
 \item\textbf{Pypi.} The program was written in Python, keeping the concept of cross-platform programming in mind. To run \textit{PyCorrFit} on any other operating system, the installation of Python v.2.7 is required. \textit{PyCorrFit} is included in the package index of \texttt{python-pip} (\url{http://pypi.python.org/pypi/pip}) and can be installed via
 \texttt{pip~install~pycorrfit}\footnote{See also the wiki article at \url{https://github.com/paulmueller/PyCorrFit/wiki/Installation_pip}}.
 \item \textbf{Sources.}
-You can also directly download the source code at any developmental stage. Visit \textit{PyCorrFit} at GitHub (\url{https://github.com/paulmueller/PyCorrFit}). \textit{PyCorrFit} depends on the following python modules:\\
+You can also directly download the source code at any developmental stage\footnote{See also the wiki article at \url{https://github.com/paulmueller/PyCorrFit/wiki/Running-from-source}}. \textit{PyCorrFit} depends on the following python modules:\\
 \texttt{\\
 python-matplotlib ($\geq$ 1.0.1) \\
 python-numpy ($\geq$ 1.5.1) \\
 python-scipy ($\geq$ 0.8.0) \\
 python-sympy ($\geq$ 0.7.2) \\
 python-yaml \\
-python-wxtools \\
-python-wxgtk2.8-dbg \\
+python-wxgtk2.8 \\
 }
 \end{itemize}
 
 
 \vspace{1em}
-\noindent \textbf{\LaTeX .} \textit{PyCorrFit} can save correlation curves as images using matplotlib. It is also possible to utilize \LaTeX to generate these plots. On Windows, installing MiKTeX  with ``automatic package download'' will enable this feature. On MacOSx, the MacTeX distribution can be used. On other systems, the packages \LaTeX \, dvipng, Ghostscript and the scientific \LaTeX \,packages \texttt{texlive-science} and \texttt{texlive-math-extra} need to be installed.
+\noindent \textbf{\LaTeX .} \textit{PyCorrFit} can save correlation curves as images using matplotlib. It is also possible to utilize \LaTeX{} to generate these plots. On Windows, installing MiKTeX with ``automatic package download'' will enable this feature. On Mac OS X, the MacTeX distribution can be used. On other systems, a \LaTeX{} distribution, Ghostscript, \texttt{dvipng}, and the \LaTeX{} packages \texttt{texlive-latex-base} and \texttt{texlive-math-extra} need to be installed.
 
 \subsubsection{Running \textit{PyCorrFit}}
 \label{sec:intro.runni}
@@ -80,7 +79,7 @@ Download the executable file and double-click on the \texttt{PyCorrFit.exe} icon
 \paragraph*{Ubuntu/Debian.}
 \texttt{PyCorrFit} is integrated into Debian and thus behaves like any other application.
 \paragraph*{Mac OSx.}
-When downloading the archive \texttt{PyCorrFit.zip}, the binary should be extracted automatically (if not, extract the archive) and you can double-click it to run \textit{PyCorrFit}.
+When downloading the archive \texttt{PyCorrFit.zip}, the binary should be extracted automatically (if not, extract the archive) and you can double-click it to run \textit{PyCorrFit}. If prompted, select \textit{Terminal.app} from the \textit{Utilities} folder. Drag the \textit{PyCorrFit} icon onto the Terminal; Mac OS will then assign Terminal.app as the default application for opening this file (this can be verified in the file properties).
 \paragraph*{from source.}
 Invoke \texttt{python PyCorrFit.py} from the command line.
 
@@ -105,16 +104,14 @@ The fitting itself is usually explored with a representative data set. Here, the
 
 \subsection{Graphical user interface (GUI)}
 \label{sec:intro.graph}
-
-Together with a system's terminal of the platform on which \textit{PyCorrFit} was installed (Windows, Linux, MacOS), the \textit{main window} opens when starting the program as described in \hyref{Section}{sec:intro.runni}. The window title bar contains the version of \textit{PyCorrFit} and, if a session was re-opened or saved, the name of the fitting session. A menu bar provides access to many supporting tools and additional information as thoroughly described in \hyref{Chapter}{sec:menub}. 
-
-There are three gateways for experimental data into a pre-existing or a new \textit{PyCorrFit} session (\textit{File/Load data}, \textit{File/Open session}, and \textit{Current page/Import data}). When a session has been opened or correlation data have been loaded, each correlation curve is displayed on a separate page of a notebook. For quick identification of the active data set, a tab specifies the page number, the correlated channels (AC/CC), and the run number in cases where  multip [...]
-
 \begin{figure}[h]
 \centering
 \includegraphics[width=\linewidth]{PyCorrFit_Screenshot_Main.png}
  \mycaption{user interface of PyCorrFit}{Confocal measurement of nanomolar Alexa488 in aqueous solution. To avoid after-pulsing, the autocorrelation curve was measured by cross-correlating signals from two detection channels using a 50 \% beamsplitter. Fitting reveals the average number of observed particles ($n \approx 6$) and their residence time in the detection volume ($\tau_{\rm diff} = \SI{28}{\mu s})$. \label{fig:mainwin} }
 \end{figure}
+When the program is started as described in \hyref{Section}{sec:intro.runni}, the \textit{main window} opens together with a terminal of the platform on which \textit{PyCorrFit} was installed (Windows, Linux, MacOS). The window title bar contains the version of \textit{PyCorrFit} and, if a session was re-opened or saved, the name of the fitting session. A menu bar provides access to many supporting tools and additional information as thoroughly described in \hyref{Chapter}{sec:menub}. 
+
+There are three gateways for experimental data into a pre-existing or a new \textit{PyCorrFit} session (\textit{File/Load data}, \textit{File/Open session}, and \textit{Current page/Import data}). When a session has been opened or correlation data have been loaded, each correlation curve is displayed on a separate page of a notebook. For quick identification of the active data set, a tab specifies the page number, the correlated channels (AC/CC), and the run number in cases where  multip [...]
 
 The active page displaying a correlation function is divided in two panels (\hyref{Figure}{fig:mainwin}). At the left hand side the page shows a pile of boxes containing values or fitting options associated with the current model and data set: 
 
@@ -122,7 +119,11 @@ The active page displaying a correlation function is divided in two panels (\hyr
 \item \textit{Data set} specifies the assigned model abbreviation in parentheses and shows a unique identifier for the correlation curve containing the file name, the number of the ``run'', and the data channel. This string is automatically assembled during the loading procedure (\hyref{Section}{sec:menub.filem.loadd}). However, during the session it can be manually edited, thereby allowing to re-name or flag certain data during the fitting analysis.
 \item \textit{Model parameters} displays the values which determine the current shape of the assigned model function (\hyref{Chapter}{sec:theor}). Initially, starting values are loaded as they were defined in the model description (\hyref{Section}{sec:menub.filem.impor}). Little buttons allow a stepwise increase or decrease in units of 1/10\textsuperscript{th}. It is also possible to directly enter some numbers. A checkbox is used to set the parameter status to ``varied'' (checked) or `` [...]
 \item \textit{Amplitude corrections} applies additional rescaling to amplitude related parameters like the number of particles $n$ or amplitude fractions associated with different correlation times ($n_1$, $n_2$, etc.). Experimental values of non-correlated background intensity can be manually entered for each channel. In addition, the correlation curves can be normalized, to facilitate a visual comparison of the decay.
-\item \textit{Fitting options} offers weighted fitting. The underlying idea is that data points with higher accuracy should also have a higher impact on model parameters. To derive weights, \textit{PyCorrFit} calculates the variance of the difference between the actual data and a smooth, empiric representation of the curve for a certain neighbourhood. The number of neighbouring data points at each side ($j > 0$) can be set. For such a smooth representation a  spline function or the model [...]
+\item \textit{Fitting options} offers weighted fitting (a) and a choice for the fit algorithm (b).
+\begin{itemize}
+\item[\textbf{a)}] The underlying idea is that data points with higher accuracy should also have a higher impact on model parameters. To derive weights, \textit{PyCorrFit} calculates the variance of the difference between the actual data and a smooth, empiric representation of the curve for a certain neighbourhood. The number of neighbouring data points at each side ($j > 0$) can be set. For such a smooth representation a  spline function or the model function with the current parameter  [...]
+\item[\textbf{b)}] Several fitting algorithms can be chosen. We recommend using the Levenberg-Marquardt algorithm. For more information, see \hyref{Section}{sec:theor.alg}.
+\end{itemize}
 \end{itemize}
 At the right hand side are two graphics windows. The dimensionless correlation functions $G(\tau)$ are plotted against the lag time ($\tau$) on a logarithmic scale. Below, a second window shows the residuals, the actual numerical difference between the correlation data and the model function. Fitting with appropriate models will scatter the residuals symmetrically around zero ($x$-axis). When weighted fitting was performed, the weighted residuals are shown. A good fit will not leave resi [...]
 
@@ -146,19 +147,27 @@ Some examples can be found at GitHub in the \textit{PyCorrFit} repository, e.g.
 \label{sec:menub.filem.loadd}
 \textit{Load data }is the first way to import multiple correlation data sets into a \textit{PyCorrFit} session. The supported file formats can be found in a drop-down list of supported file endings in the pop-up dialog \textit{Open data files}:
 
-
 \begin{tabular}{l l}
  \rule{0pt}{3ex}  (1) All supported files & default \\
- \rule{0pt}{3ex} (2) Confocor3 (*.fcs) & AIM 4.2, ZEN 2010, Zeiss, Germany \\
- \rule{0pt}{3ex} (3) Correlator ALV6000 (*.ASC) & ALV Laser GmbH, Langen, Germany \\
- \rule{0pt}{3ex} (4) Correlator.com (*.SIN) & www.correlator.com, USA \\
+
+ \rule{0pt}{3ex} (2) ALV (*.ASC) & ALV Laser GmbH, Langen, Germany \\
+
+ \rule{0pt}{3ex} (3) Correlator.com (*.SIN) & www.correlator.com, USA \\
+
+ \rule{0pt}{3ex} (4) Zeiss ConfoCor3 (*.fcs) & AIM 4.2, ZEN 2010, Zeiss, Germany \\
+ 
  \rule{0pt}{3ex} (5) Matlab ‘Ries (*.mat) & EMBL Heidelberg, Germany \\
+
  \rule{0pt}{3ex} (6) PyCorrFit (*.csv) & Paul Müller, TU Dresden, Germany \\
- \rule{0pt}{3ex} (7) Zip files (*.zip) & Paul Müller, TU Dresden, Germany \\
+
+ \rule{0pt}{3ex} (7) PyCorrFit session (*.pcfs) & Paul Müller, TU Dresden, Germany \\
+
+
+ \rule{0pt}{3ex} (8) Zip file (*.zip) & Paul Müller, TU Dresden, Germany \\
 \end{tabular}
 \vspace{3ex}
 \newline
-While (2)-(4) are file formats associated with commercial hardware, (5) refers to a MATLAB based FCS evaluation software developed by Jonas Ries in the Schwille lab at TU Dresden, (6) is a text file containing comma-separated values (csv) generated by PyCorrFit via the command \textit{Current Page / Save data}. Zip-files are automatically decompressed and can be imported when matching one of the above mentioned formats. In particular loading of zip files is a possibility to re-import cor [...]
+While (2)-(4) are file formats associated with commercial hardware, (5) refers to a MATLAB-based FCS evaluation software developed by Jonas Ries in the Schwille lab at TU Dresden, and (6) is a text file containing comma-separated values (csv) generated by PyCorrFit via the command \textit{Current Page / Save data}. Zip files are automatically decompressed and can be imported when matching one of the above-mentioned formats. In particular, loading *.pcfs files (which are actually zip files) [...]
 
 During loading, the user is prompted to assign fit models in the \textit{Choose Models} dialogue window. There, curves are sorted according to channel (for example AC1, AC2, CC12, and CC21, as a typical outcome of a dual-color cross-correlation experiment). For each channel a fit model must be selected from the list (see \hyref{Section}{sec:menub.model}):
 
@@ -166,7 +175,7 @@ If a file format is not yet listed, the correlation data could be converted into
 
 \subsubsection{File / Open session}
 \label{sec:menub.filem.opens}
-This command is the second way to import data into PyCorrFit. In contrast to \textit{Load data}, it opens an entire fitting project, which was previously saved with \textit{PyCorrFit}. Session files are *.zip files named *.fcsfit-session.zip. These files contain all information to restore a session, including comments, model assigned correlation data, and the current state of parameters for each data set (\hyref{Section}{sec:menub.filem.saves}).
+This command is the second way to import data into PyCorrFit. In contrast to \textit{Load data}, it opens an entire fitting project, which was previously saved with \textit{PyCorrFit}. Session files are *.zip files named *.pcfs. These files contain all information to restore a session, including comments, model assigned correlation data, and the current state of parameters for each data set (\hyref{Section}{sec:menub.filem.saves}).
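Since a session file is an ordinary zip archive with a .pcfs extension, its contents can be inspected directly. A minimal sketch (the file name is hypothetical):

    import zipfile

    # A *.pcfs session is a zip archive; list the files it contains
    with zipfile.ZipFile("MySession.pcfs") as session:
        print(session.namelist())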
 
 \subsubsection{File / Comment session}
 \label{sec:menub.filem.comme}
@@ -212,7 +221,7 @@ By default, the current page is taken as a reference to perform automated fittin
 
 For fitting, it is crucial to carefully define the starting parameters, whether parameters should be fixed or varied, the range of values which make physically sense, and other options offered within the \textit{Main window}. By executing \textit{Apply to applicable pages}, these settings are transferred to all other pages assigned to the same fit model. Note that this includes the range of lag times (lag time channels) which may have been changed with the \textit{Data range }tool for in [...]
 
-The button \textit{Fit applicable pages} then performs fitting on all pages of the same batch. Alternatively, the user can define an external source of parameters as a reference, i.e. the first page of some \textit{Other session} (*.fcsfit-session.zip). However, this assumes a consistent assignment of model functions.
+The button \textit{Fit applicable pages} then performs fitting on all pages of the same batch. Alternatively, the user can define an external source of parameters as a reference, i.e. the first page of some \textit{Other session} (*.pcfs). However, this assumes a consistent assignment of model functions.
 
 \subsubsection{Tools / Global fitting}
 \label{sec:menub.tools.globa}
@@ -678,9 +687,8 @@ The lateral detection area is a convolution of the point spread function of the
 with a square of side length $a$.
 
 
-\subsection{Non-linear least-squares fit}
+\subsection{Fitting}
 \label{sec:theor.nonle}
-\textit{PyCorrFit} uses the non-linear least-squares fitting capabilities from \texttt{scipy.optimize}. This package utilizes the Levenberg–Marquardt algorithm to minimize the sum of the squares. More information on this topic can be obtained from the online documentation of \texttt{leastsq}\footnote{\url{http://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.leastsq.html##scipy.optimize.leastsq}}. 
 One can define a distance $d(G,H)$ between two discrete functions $G$ and $H$ with the discrete domain of definition $\tau_1 \dots \tau_n$ as the sum of squares:
 \begin{equation}
 d(G,H) = \sum_{i=1}^n \left[ G(\tau_i) - H(\tau_i) \right]^2
@@ -692,7 +700,7 @@ The least-squares method minimizes this distance between the model function $G$
 The minimum distance $\chi^2$ is used to characterize the success of a fit. Note, that if the number of fitting parameters $k$ becomes too large, multiple values for $\chi^2$ can be found, depending on the starting values of the $k$ parameters.
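As a minimal numerical sketch of this distance, assuming the discretized curves are NumPy arrays:

    import numpy as np

    def distance(G, H):
        # d(G, H): sum of squared differences over the lag-time channels
        return np.sum((G - H)**2)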
 
 
-\subsection{Weighted fitting}
+\subsubsection{Weighted fitting}
 \label{sec:theor.weigh}
 In certain cases, it is useful to implement weights (standard deviation) $\sigma_i$ for the calculation of $\chi^2$. For example, very noisy parts of a correlation curve can falsify the resulting fit. In \textit{PyCorrFit}, weighting is implemented as follows:
 \begin{equation}
@@ -700,5 +708,22 @@ In certain cases, it is useful to implement weights (standard deviation) $\sigma
 \end{equation}
 \textit{PyCorrFit} is able to calculate the weights $\sigma_i$ from the experimental data. The different approaches of this calculation of weights implemented in \textit{PyCorrFit} are explained in \hyref{Section}{sec:intro.graph}.
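A sketch of the weighted distance and of the neighbourhood-based weight estimation described above (the helper names are hypothetical; smooth stands for the spline or model representation of the curve):

    import numpy as np

    def distance_weighted(G, H, sigma):
        # Weighted sum of squares; sigma[i] is the standard deviation
        # assigned to lag-time channel i
        return np.sum(((G - H) / sigma)**2)

    def derive_weights(data, smooth, j):
        # Estimate sigma[i] as the standard deviation of the difference
        # between the data and its smooth representation, taken over a
        # neighbourhood of j points on each side.
        residuals = data - smooth
        sigma = np.empty(len(data))
        for i in range(len(data)):
            lo = max(0, i - j)
            hi = min(len(data), i + j + 1)
            sigma[i] = np.std(residuals[lo:hi])
        return sigma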
 
+\subsubsection{Algorithms}
+\label{sec:theor.alg}
+\textit{PyCorrFit} uses the non-linear least-squares fitting capabilities of \texttt{scipy.optimize}, which provides several algorithms for minimizing the sum of squares; the following can be selected in \textit{PyCorrFit}.
+The descriptions of the algorithms listed here are partly copied from the scipy documentation at \url{http://docs.scipy.org/doc/scipy/reference/optimize.html}. 
+\begin{itemize}
+\item The \textbf{BFGS} method uses the quasi-Newton method of Broyden, Fletcher, Goldfarb, and Shanno (BFGS) \cite{Nocedal2006} (p. 136). It uses first derivatives only and has demonstrated good performance even for non-smooth optimizations.
+\item The \textbf{Levenberg-Marquardt} algorithm \cite{Levenberg1944} uses first derivatives and combines the Gauss–Newton algorithm with a trust-region approach. It is very robust and a popular choice for curve fitting; \textit{PyCorrFit} uses it by default. With this algorithm, \textit{PyCorrFit} can also estimate errors of the fit parameters from the covariance matrix.
+\item The \textbf{Nelder-Mead} method uses the Simplex algorithm \cite{Nelder1965,Wright1996}. It has been successful in many applications, but algorithms that use first- and/or second-derivative information may be preferred for their generally better performance and robustness.
+\item The \textbf{Polak-Ribiere} method uses a nonlinear conjugate gradient algorithm by Polak and Ribiere, a variant of the Fletcher-Reeves method described in \cite{Nocedal2006} (pp. 120--122). Only first derivatives are used.
+\item The \textbf{Powell} method is a modification of Powell's method \cite{Powell1964, Press}, which is a conjugate direction method. It performs sequential one-dimensional minimizations along each vector of the direction set, which is updated at each iteration of the main minimization loop. The function need not be differentiable, and no derivatives are taken.
+\end{itemize}
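These choices map directly onto scipy.optimize; the following is a condensed restatement of the Algorithms table in src/fitting.py introduced by this commit:

    import scipy.optimize as spopt

    Algorithms = {
        "Lev-Mar":       (spopt.leastsq,     "Levenberg-Marquardt"),
        "Nelder-Mead":   (spopt.fmin,        "Nelder-Mead (downhill simplex)"),
        "BFGS":          (spopt.fmin_bfgs,   "BFGS (quasi-Newton)"),
        "Powell":        (spopt.fmin_powell, "modified Powell (conjugate direction)"),
        "Polak-Ribiere": (spopt.fmin_cg,     "Polak-Ribiere (nonlinear conjugate gradient)"),
    }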
 
 \input{PyCorrFit_doc_models}
+
+\section{Troubleshooting}
+If you are having problems with PyCorrFit, you might find the solution in the frequently asked questions\footnote{\url{https://github.com/paulmueller/PyCorrFit/wiki/Frequently-Asked-Questions-\%28FAQ\%29}} or on other pages in the \textit{PyCorrFit} 
+wiki\footnote{\url{https://github.com/paulmueller/PyCorrFit/wiki}}.
+There you will also find instructions on how to contact us to file a bug or to request a feature.
diff --git a/doc-src/PyCorrFit_doc_models.tex b/doc-src/PyCorrFit_doc_models.tex
index 70b4bb7..27b4c8c 100644
--- a/doc-src/PyCorrFit_doc_models.tex
+++ b/doc-src/PyCorrFit_doc_models.tex
@@ -181,12 +181,12 @@ The lateral detection area has the same shape as in confocal FCS. Thus, correlat
 
 % 3D diffusion (Gauß/exp)
 \noindent \begin{tabular}{lp{.7\textwidth}}
-Name & \textbf{TIR (Gaussian/Exp.) 3D} \\ 
-ID & \textbf{6013} \\ 
-Descr. &  Three-dimensional free diffusion with a Gaussian lateral detection profile and an exponentially decaying profile in axial direction\cite{Starr2001, Hassler2005, Ohsugi2006}. \\ 
+Name & \textbf{TIR (Gaussian/Exp.) T+3D} \\ 
+ID & \textbf{6014} \\ 
+Descr. &  Three-dimensional free diffusion with a Gaussian lateral detection profile and an exponentially decaying profile in axial direction, including a triplet component\cite{Starr2001, Hassler2005, Ohsugi2006}. \\ 
 \end{tabular}
 \begin{align}
-G(\tau) = \frac{1}{C}  \frac{ \kappa^2}{ \pi (R_0^2 +4D\tau)}
+G(\tau) = \frac{1}{C}  \frac{ \kappa^2}{ \pi (R_0^2 +4D\tau)} \left(1 + \frac{T e^{-\tau/\tau_\mathrm{trip}}}{1-T}  \right)
  \left( \sqrt{\frac{D \tau}{\pi}} + \frac{1 - 2 D \tau \kappa^2}{2 \kappa}  w\!\left(i \sqrt{D \tau} \kappa\right) \right)
 \end{align} 
 \begin{center}
@@ -195,6 +195,8 @@ $C$ & Particle concentration in confocal volume \\
 $\kappa$ &  Evanescent decay constant ($\kappa = 1/d_\mathrm{eva}$)\\ 
 $R_0$ & Lateral extent of the detection volume \\
 $D$ & Diffusion coefficient  \\
+$T$ &  Fraction of particles in triplet (non-fluorescent) state\\ 
+$\tau_\mathrm{trip}$ &  Characteristic residence time in triplet state \\ 
 \end{tabular}
 \end{center}
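As a numerical sketch of this model function, assuming scipy.special.wofz for the Faddeeva function w (the function name is illustrative):

    import numpy as np
    from scipy.special import wofz

    def G_tirf_trip_3d(tau, C, kappa, R0, D, T, tau_trip):
        # Triplet factor: 1 + T*exp(-tau/tau_trip)/(1 - T)
        triplet = 1.0 + T * np.exp(-tau / tau_trip) / (1.0 - T)
        # w(i*x) is real-valued for real x
        x = np.sqrt(D * tau) * kappa
        w = wofz(1j * x).real
        axial = np.sqrt(D * tau / np.pi) \
                + (1.0 - 2.0 * D * tau * kappa**2) / (2.0 * kappa) * w
        return kappa**2 / (C * np.pi * (R0**2 + 4.0 * D * tau)) * triplet * axial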
 \vspace{2em}
diff --git a/doc-src/README.md b/doc-src/README.md
new file mode 100644
index 0000000..9df4f4c
--- /dev/null
+++ b/doc-src/README.md
@@ -0,0 +1,23 @@
+This folder contains the TeX-source of the 
+[PyCorrFit documentation](https://github.com/paulmueller/PyCorrFit/raw/master/PyCorrFit_doc.pdf).
+
+If, for some reason, you wish to compile it yourself, you will need a 
+working LaTeX distribution.
+
+If you are running a Linux system, make sure that the following packages
+are installed:
+
+    ghostscript  
+    texlive  
+    texlive-base  
+    texlive-bibtex-extra  
+    texlive-latex-extra  
+    texlive-math-extra  
+    texlive-science  
+    
+    
+Run these commands repeatedly (three times to be on the safe side) to
+build the documentation:
+    
+    pdflatex -synctex=1 -interaction=nonstopmode PyCorrFit_doc.tex     
+    bibtex PyCorrFit_doc.aux  
diff --git a/sample_sessions/CSFCS_DiO-in-DOPC.fcsfit-session.zip b/sample_sessions/CSFCS_DiO-in-DOPC.pcfs
similarity index 98%
rename from sample_sessions/CSFCS_DiO-in-DOPC.fcsfit-session.zip
rename to sample_sessions/CSFCS_DiO-in-DOPC.pcfs
index 7d6c852..2a7849e 100644
Binary files a/sample_sessions/CSFCS_DiO-in-DOPC.fcsfit-session.zip and b/sample_sessions/CSFCS_DiO-in-DOPC.pcfs differ
diff --git a/sample_sessions/ConfocalFCS_Alexa488_xcorr.fcsfit-session.zip b/sample_sessions/ConfocalFCS_Alexa488_xcorr.pcfs
similarity index 93%
rename from sample_sessions/ConfocalFCS_Alexa488_xcorr.fcsfit-session.zip
rename to sample_sessions/ConfocalFCS_Alexa488_xcorr.pcfs
index 65194b0..bf61e9b 100644
Binary files a/sample_sessions/ConfocalFCS_Alexa488_xcorr.fcsfit-session.zip and b/sample_sessions/ConfocalFCS_Alexa488_xcorr.pcfs differ
diff --git a/src/PyCorrFit.py b/src/PyCorrFit.py
index f77c598..9e951d7 100755
--- a/src/PyCorrFit.py
+++ b/src/PyCorrFit.py
@@ -28,13 +28,16 @@
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 
-
-import csv
 from distutils.version import LooseVersion
 import sys
-# Import matplotlib a little earlier. This way some problems with saving
-# dialogs that are not made by "WXAgg" are solved.
 
+class Fake(object):
+    """ Fake module.
+    """
+    def __init__(self):
+        self.__version__ = "0.0 unknown"
+        self.version = "0.0 unknown"
+        self.use = lambda x: None
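+
+# Note: the Fake object stands in for missing modules (e.g. matplotlib,
+# sympy) so that the version checks further below still succeed.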
 
 ## On Windows XP I had problems with the unicode Characters.
 # I found this at 
@@ -44,8 +47,13 @@ import platform
 if platform.system() == 'Windows':
     reload(sys)
     sys.setdefaultencoding('utf-8')
-    
-import matplotlib
+
+# Import matplotlib a little earlier. This way some problems with saving
+# dialogs that are not made by "WXAgg" are solved.
+try:
+    import matplotlib
+except ImportError:
+    matplotlib = Fake()
 # We do catch warnings about performing this before matplotlib.backends stuff
 #matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets
 import warnings
@@ -72,10 +80,6 @@ except ImportError:
     print "will not work!"
     # We create a fake module sympy with a __version__ property.
     # This way users can run PyCorrFit without having installed sympy.
-    class Fake(object):
-        def __init__(self):
-            self.__version__ = "0.0 unknown"
-            self.version = "0.0 unknown"
     sympy = Fake()
 # We must not import wx here. frontend/gui does that. If we do import wx here,
 # somehow unicode characters will not be displayed correctly on windows.
@@ -115,14 +119,12 @@ print gui.doc.info(version)
 
 ## Check important module versions
 print "\n\nChecking module versions..."
-CheckVersion(csv.__version__, "1.0", "csv")
+CheckVersion(matplotlib.__version__, "1.0.0", "matplotlib")
 CheckVersion(np.__version__, "1.5.1", "NumPy")
+CheckVersion(yaml.__version__, "3.09", "PyYAML")
 CheckVersion(scipy.__version__, "0.8.0", "SciPy")
 CheckVersion(sympy.__version__, "0.7.2", "sympy")
 CheckVersion(gui.wx.__version__, "2.8.10.1", "wxPython")
-CheckVersion(yaml.__version__, "3.09", "PyYAML")
-
-## Command line ?
 
 
 ## Start gui
@@ -132,16 +134,23 @@ frame = gui.MyFrame(None, -1, version)
 # in the arguments.
 sysarg = sys.argv
 for arg in sysarg:
-    if len(arg) >= 18:
+    # Load a session file given on the command line: the new *.pcfs
+    # format or the legacy *.fcsfit-session.zip format.
+    if len(arg) > 4 and arg[-4:] == "pcfs":
+        print "\nLoading Session "+arg
+        frame.OnOpenSession(sessionfile=arg)
+        break
+    elif len(arg) >= 18:
         if arg[-18:] == "fcsfit-session.zip":
             print "\nLoading Session "+arg
             frame.OnOpenSession(sessionfile=arg)
+            break
     elif arg[:6] == "python":
         pass
     elif arg[-12:] == "PyCorrFit.py":
         pass
+    elif arg[-11:] == "__main__.py":
+        pass
     else:
         print "I do not know what to do with this argument: "+arg
-# Now start the app
+    
 app.MainLoop()
-
diff --git a/src/__init__.py b/src/__init__.py
index c525691..e743160 100644
--- a/src/__init__.py
+++ b/src/__init__.py
@@ -34,4 +34,4 @@ import readfiles
 __version__ = doc.__version__
 __author__ = "Paul Mueller"
 __email__ = "paul.mueller at biotec.tu-dresden.de"
-
+__license__ = "GPL v2"
diff --git a/src/doc.py b/src/doc.py
index 140c27e..b45e687 100755
--- a/src/doc.py
+++ b/src/doc.py
@@ -30,8 +30,19 @@
 
 
 import sys
-import csv
-import matplotlib
+
+# This is a fake class for modules not available.
+class Fake(object):
+    def __init__(self):
+        self.__version__ = "N/A"
+        self.version = "N/A"
+        self.use = lambda x: None        
+try:
+    import matplotlib
+except ImportError:
+    # Create fake object for matplotlib
+    matplotlib = Fake()
+    
 # We do catch warnings about performing this before matplotlib.backends stuff
 #matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets
 import warnings
@@ -43,27 +54,14 @@ import os
 import platform
 import scipy
 
-# This is a fake class for modules not available.
-class Fake(object):
-    def __init__(self):
-        self.__version__ = "N/A"
-        self.version = "N/A"
+
 
 try:
     import sympy
 except ImportError:
     print " Warning: module sympy not found!"
     sympy = Fake()
-try:
-    import urllib2
-except ImportError:
-    print " Warning: module urllib not found!"
-    urllib = Fake()
-try:
-    import webbrowser
-except ImportError:
-    print " Warning: module webbrowser not found!"
-    webbrowser = Fake()
+
 import wx
 import yaml
 
@@ -155,7 +153,9 @@ def info(version):
         texta = textlin
     one = "    PyCorrFit version "+version+"\n\n"
     two = "\n\n    Supported file types:"
-    for item in readfiles.Filetypes.keys():
+    keys = readfiles.Filetypes.keys()
+    keys.sort()
+    for item in keys:
         if item.split("|")[0] != readfiles.Allsupfilesstring:
             two = two + "\n     - "+item.split("|")[0]
     lizenz = ""
@@ -184,19 +184,12 @@ def SoftwareUsed():
     """ Return some Information about the software used for this program """
     text = "Python "+sys.version+\
            "\n\nModules:"+\
-           "\n - csv "+csv.__version__+\
            "\n - matplotlib "+matplotlib.__version__+\
            "\n - NumPy "+numpy.__version__+\
-           "\n - os "+\
-           "\n - platform "+platform.__version__+\
+           "\n - PyYAML "+yaml.__version__ +\
            "\n - SciPy "+scipy.__version__+\
            "\n - sympy "+sympy.__version__ +\
-           "\n - sys "+\
-           "\n - tempfile" +\
-           "\n - urllib2 "+ urllib2.__version__ +\
-           "\n - webbrowser"+\
-           "\n - wxPython "+wx.__version__+\
-           "\n - yaml "+yaml.__version__
+           "\n - wxPython "+wx.__version__
     if hasattr(sys, 'frozen'):
         pyinst = "\n\nThis executable has been created using PyInstaller."
         text = text+pyinst
diff --git a/src/edclasses.py b/src/edclasses.py
index 0fb9c0a..34b7bdc 100644
--- a/src/edclasses.py
+++ b/src/edclasses.py
@@ -23,15 +23,25 @@
 
 
 # Matplotlib plotting capabilities
-import matplotlib
+try:
+    import matplotlib
+except ImportError:
+    pass
 # We do catch warnings about performing this before matplotlib.backends stuff
 #matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets
 import warnings
 with warnings.catch_warnings():
     warnings.simplefilter("ignore")
-    matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets for dialogs
+    try:
+        matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets for dialogs
+    except NameError:
+        # matplotlib could not be imported above
+        pass
 # We will hack this toolbar here
-from matplotlib.backends.backend_wx import NavigationToolbar2Wx 
+try:
+    from matplotlib.backends.backend_wx import NavigationToolbar2Wx 
+except ImportError:
+    pass
+    
 import numpy as np
 import sys
 import traceback
@@ -227,6 +237,8 @@ class MyYesNoAbortDialog(wx.Dialog):
         self.EndModal(wx.ID_YES)
 
 
-
-# Add the save_figure function to the standard class for wx widgets.
-matplotlib.backends.backend_wx.NavigationToolbar2Wx.save = save_figure
+try:
+    # Add the save_figure function to the standard class for wx widgets.
+    matplotlib.backends.backend_wx.NavigationToolbar2Wx.save = save_figure
+except NameError:
+    pass
diff --git a/src/leastsquaresfit.py b/src/fitting.py
similarity index 75%
rename from src/leastsquaresfit.py
rename to src/fitting.py
index 2613bdc..96a0e17 100644
--- a/src/leastsquaresfit.py
+++ b/src/fitting.py
@@ -1,21 +1,10 @@
 # -*- coding: utf-8 -*-
 """ PyCorrFit
 
-    Module leastsquaresfit
+    Module fitting
     Here are the necessary functions for computing a fit with given parameters.
     See included class "Fit" for more information.
 
-    scipy.optimize.leastsq
-    "leastsq" is a wrapper around MINPACK's lmdif and lmder algorithms.
-    Those use the Levenberg-Marquardt algorithm.
-      subroutine lmdif
- 
-      the purpose of lmdif is to minimize the sum of the squares of
-      m nonlinear functions in n variables by a modification of
-      the levenberg-marquardt algorithm. the user must provide a
-      subroutine which calculates the functions. the jacobian is
-      then calculated by a forward-difference approximation.
-
     Copyright (C) 2011-2012  Paul Müller
 
     This program is free software; you can redistribute it and/or modify
@@ -66,6 +55,15 @@ class Fit(object):
                               self.external_deviations has to be set before
                               self.ApplyParameters is called. Cropping with
                               *interval* is performed here.
+        fit_algorithm - The fitting algorithm to be used for minimization
+                        Have a look at the PyCorrFit documentation for more
+                        information.
+                        - "Lev-Mar" Least squares minimization
+                        - "Nelder-Mead" Simplex
+                        - "BFGS" quasi-Newton method of Broyden,
+                                 Fletcher, Goldfarb and Shanno
+                        - "Powell"
+                        - "Polak-Ribiere"
     """
     def __init__(self):
         """ Initial setting of needed variables via the given *fitset* """   
@@ -90,9 +88,9 @@ class Fit(object):
         self.fittype = "None"
         # Chi**2 Value
         self.chi = None
-        # Messages from leastsq
+        # Messages from fit algorithm
         self.mesg = None
-        # Optimal parameters found by leastsq
+        # Optimal parameters found by fit algorithm
         self.parmoptim = None
         self.covar = None # covariance matrix 
         self.parmoptim_error = None # Errors of fit
@@ -107,6 +105,8 @@ class Fit(object):
         # Standard is yes. If there are no weights
         # (self.fittype not set) then this value becomes False
         self.weightedfit=True
+        # Set the standard method for minimization
+        self.fit_algorithm = "Lev-Mar"
         
 
 
@@ -179,9 +179,14 @@ class Fit(object):
                     plotting.savePlotSingle(name, 1*x, 1*y, 1*ys, dirname = ".",
                                             uselatex=self.uselatex)
                 except:
-                    plt.xscale("log")
-                    plt.plot(x,ys, x,y)
-                    plt.show()
+                    try:
+                        plt.xscale("log")
+                        plt.plot(x,ys, x,y)
+                        plt.show()
+                    except (ImportError, NameError):
+                        # Tell the user to install matplotlib
+                        print "Matplotlib not found!"
+                        
             ## Calculation of variance
             # In some cases, the actual cropping interval from self.startcrop to
             # self.endcrop is chosen, such that the dataweights must be
@@ -294,12 +299,13 @@ class Fit(object):
 
 
     def fit_function(self, parms, x):
-        """ Create the function to be minimized via least squares.
-            The old function *function* has more parameters than we need for
-            the fitting. So we use this function to set only the necessary 
-            parameters. Returns what *function* would have done.
+        """ Create the function to be minimized. The old function
+            `function` has more parameters than we need for the fitting.
+            So we use this function to set only the necessary 
+            parameters. Returns what `function` would have done.
         """
-        # Reorder the needed variables from *spopt.leastsq* for *function*.
+        # We reorder the needed variables to only use these that are
+        # not fixed for minimization
         index = 0
         for i in np.arange(len(self.values)):
             if self.valuestofit[i]:
@@ -320,14 +326,28 @@ class Fit(object):
         return tominimize
 
 
+    def fit_function_scalar(self, parms, x):
+        """
+            Wrapper of `fit_function` for scalar minimization methods
+            (all methods except "Lev-Mar").
+            Returns the sum of squares of the residuals.
+        """
+        e = self.fit_function(parms,x)
+        return np.sum(e*e)
+        
+
     def get_chi_squared(self):
-        # Calculate Chi**2
-        degrees_of_freedom = len(self.x) - len(self.parmoptim) - 1
-        return np.sum( (self.fit_function(self.parmoptim, self.x))**2) / \
-                   degrees_of_freedom
+        """
+            Calculate Chi² for the current class.
+        """
+        # Calculate degrees of freedom
+        dof = len(self.x) - len(self.parmoptim) - 1
+        # This is exactly what is minimized by the scalar minimizers
+        chi2 = self.fit_function_scalar(self.parmoptim, self.x)
+        return chi2 / dof
 
 
-    def least_square(self):
+    def minimize(self):
         """ This will minimize *self.fit_function()* using least squares.
             *self.values*: The values with which the function is called.
             *valuestofit*: A list with bool values that indicate which values
@@ -339,16 +359,23 @@ class Fit(object):
             print "No fitting parameters selected."
             self.valuesoptim = 1*self.values
             return
+        # Get algorithm
+        algorithm = Algorithms[self.fit_algorithm][0]
+
         # Begin fitting
-        res = spopt.leastsq(self.fit_function, self.fitparms[:],
+        if self.fit_algorithm == "Lev-Mar":
+            res = algorithm(self.fit_function, self.fitparms[:],
                             args=(self.x), full_output=1)
-        (popt, pcov, infodict, errmsg, ier) = res
-        self.parmoptim = popt
-        if ier not in [1,2,3,4]:
-            print "Optimal parameters not found: " + errmsg
+        else:
+            res = algorithm(self.fit_function_scalar, self.fitparms[:],
+                            args=([self.x]), full_output=1)
+
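+        # All supported minimizers return a tuple when full_output=1;
+        # res[0] is the optimized parameter vector in every case.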
+        # The optimal parameters
+        self.parmoptim = res[0]
+
         # Now write the optimal parameters to our values:
         index = 0
-        for i in np.arange(len(self.values)):
+        for i in range(len(self.values)):
             if self.valuestofit[i]:
                 self.values[i] = self.parmoptim[index]
                 index = index + 1
@@ -357,14 +384,68 @@ class Fit(object):
         # Write optimal parameters back to this class.
         self.valuesoptim = 1*self.values # This is actually a redundance array
         self.chi = self.get_chi_squared()
-        try:
-            self.covar = pcov * self.chi # The covariance matrix
-        except:
-            print "PyCorrFit Warning: Error estimate not possible, because we"
-            print "          could not calculate covariance matrix. Please try"
-            print "          reducing the number of fitting parameters."
-            self.parmoptim_error = None
+        
+        # Compute error estimates for fit (Only "Lev-Mar")
+        if self.fit_algorithm == "Lev-Mar":
+            # This is the standard algorithm; therefore we are
+            # a little bit more verbose here.
+            if res[4] not in [1,2,3,4]:
+                print "Optimal parameters not found: " + res[3]
+            try:
+                self.covar = res[1] * self.chi # The covariance matrix
+            except:
+                print "PyCorrFit Warning: Error estimate not possible, because we"
+                print "          could not calculate covariance matrix. Please try"
+                print "          reducing the number of fitting parameters."
+                self.parmoptim_error = None
+            else:
+                # Error estimation of fitted parameters
+                if self.covar is not None:
+                    self.parmoptim_error = np.diag(self.covar)
         else:
-            # Error estimation of fitted parameters
-            if self.covar is not None:
-                self.parmoptim_error = np.diag(self.covar)
+            self.parmoptim_error = None
+
+
+def GetAlgorithmStringList():
+    """
+        Get supported fitting algorithms as strings.
+        Returns two key-sorted lists: the algorithm keys and their display names.
+    """
+    A = Algorithms
+    out1 = list()
+    out2 = list()
+    a = list(A.keys())
+    a.sort()
+    for key in a:
+        out1.append(key)
+        out2.append(A[key][1])
+    return out1, out2
+    
+
+# As of version 0.8.3, we support several minimization methods for
+# fitting data to experimental curves.
+# These functions must be callable like scipy.optimize.leastsq, e.g.
+# res = spopt.leastsq(self.fit_function, self.fitparms[:],
+#                     args=(self.x), full_output=1)
+Algorithms = dict()
+
+# The original algorithm is the least-squares fit "leastsq".
+Algorithms["Lev-Mar"] = [spopt.leastsq, 
+           "Levenberg-Marquardt"]
+
+# simplex 
+Algorithms["Nelder-Mead"] = [spopt.fmin,
+           "Nelder-Mead (downhill simplex)"]
+
+# quasi-Newton method of Broyden, Fletcher, Goldfarb, and Shanno
+Algorithms["BFGS"] = [spopt.fmin_bfgs,
+           "BFGS (quasi-Newton)"]
+
+# modified Powell method
+Algorithms["Powell"] = [spopt.fmin_powell,
+           "modified Powell (conjugate direction)"]
+
+# nonlinear conjugate gradient method by Polak and Ribiere
+Algorithms["Polak-Ribiere"] = [spopt.fmin_cg,
+           "Polak-Ribiere (nonlinear conjugate gradient)"]
+
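+# A minimal usage sketch of the registry above; the names
+# `fit_function_scalar`, `parms0`, and `x` are hypothetical
+# placeholders that mirror the scalar branch of Fit.minimize().
+def example_minimize(fit_function_scalar, parms0, x, key="Nelder-Mead"):
+    func, name = Algorithms[key]
+    # Scalar minimizers are called like scipy.optimize.fmin; for
+    # "Lev-Mar", pass a residuals function instead (see above).
+    res = func(fit_function_scalar, parms0, args=([x]), full_output=1)
+    # The first element of the result holds the optimal parameters.
+    return res[0]
+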
diff --git a/src/frontend.py b/src/frontend.py
index 08399c3..83052be 100644
--- a/src/frontend.py
+++ b/src/frontend.py
@@ -197,7 +197,8 @@ class MyFrame(wx.Frame):
             self.MainIcon = None
 
 
-    def add_fitting_tab(self, event=None, modelid=None, counter=None):
+    def add_fitting_tab(self, event=None, modelid=None, counter=None,
+                        select=False):
         """ This function creates a new page inside the notebook.
             If the function is called from a menu, the modelid is 
             known by the event. If not, the modelid should be specified by 
@@ -212,6 +213,8 @@ class MyFrame(wx.Frame):
         if modelid is None:
             # Get the model id from the menu
             modelid = event.GetId()
+            # Give the new page focus
+            select = True
         if counter is not None:
             # Set the tabcounter right, so the tabs are counted continuously.
             counterint = int(counter.strip().strip(":").strip("#"))
@@ -229,7 +232,13 @@ class MyFrame(wx.Frame):
         Newtab = page.FittingPanel(self, counter, modelid, active_parms,
                                    self.tau)
         #self.Freeze()
-        self.notebook.AddPage(Newtab, counter+model, select=True)
+        self.notebook.AddPage(Newtab, counter+model, select=select)
+        if select:
+            # A hack to have the last page displayed in the tab menu:
+            Npag = self.notebook.GetPageCount()
+            for i in range(int(Npag)):
+                self.notebook.SetSelection(i)
+
         #self.Thaw()
         self.tabcounter = self.tabcounter + 1
         # Enable the "Current" Menu
@@ -243,18 +252,15 @@ class MyFrame(wx.Frame):
         # window is open.
         # Find Tool Statistics
         # Get open tools
-        toolkeys = self.ToolsOpen.keys()
-        for key in toolkeys:
-            tool = self.ToolsOpen[key]
-            try:
-                if tool.MyName=="STATISTICS":
-                    # Call the function properly.
-                    tool.OnPageChanged(Newtab)
-            except:
-                pass
-        #
-        #######
-        #
+        #toolkeys = self.ToolsOpen.keys()
+        #for key in toolkeys:
+        #    tool = self.ToolsOpen[key]
+        #    try:
+        #        if tool.MyName=="STATISTICS":
+        #            # Call the function properly.
+        #            tool.OnPageChanged(Newtab)
+        #    except:
+        #        pass
         return Newtab
 
 
@@ -461,6 +467,20 @@ class MyFrame(wx.Frame):
             # Try to import a selected .txt file
             try:
                 NewModel.GetCode( os.path.join(dirname, filename) )
+            except NameError:
+                # sympy is probably not installed
+                # Warn the user
+                text = ("SymPy not found.\n"+
+                        "In order to import user defined model\n"+
+                        "functions, please install Sympy\n"+
+                        "version 0.7.2 or higher.\nhttp://sympy.org/")
+                if platform.system().lower() == 'linux':
+                    text += ("\nSymPy is included in the package:\n"+
+                             "   'python-sympy'")
+                dlg = wx.MessageDialog(None, text, 'SymPy not found', 
+                                wx.OK | wx.ICON_EXCLAMATION)
+                dlg.ShowModal()
+                return
             except:
                 # The file does not seem to be what it seems to be.
                 info = sys.exc_info()
@@ -470,14 +490,16 @@ class MyFrame(wx.Frame):
                 errstr += str(info[1])+"\n"
                 for tb_item in traceback.format_tb(info[2]):
                     errstr += tb_item
-                wx.MessageDialog(self, errstr, "Error", 
+                dlg = wx.MessageDialog(self, errstr, "Error", 
                     style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP)
+                dlg.ShowModal()
                 del NewModel
                 return
             # Test the code for sympy compatibility.
             # If you write your own parser, this might be easier.
             try:
                 NewModel.TestFunction()
+
             except:
                 # This means that the imported model file could be
                 # contaminated. Ask the user how to proceed.
@@ -505,7 +527,14 @@ class MyFrame(wx.Frame):
             
 
     def OnClearSession(self,e=None,clearmodels=False):
-        """Open a previously saved session. """
+        """
+            Clear the entire session.
+        
+            Returns:
+            "abort" if the user chose not to clear the session
+            "clear" if the session was cleared, whether or not the
+                    user saved it beforehand
+        """
         numtabs = self.notebook.GetPageCount()
         # Ask, if user wants to save current session.
         if numtabs > 0:
@@ -521,7 +550,9 @@ class MyFrame(wx.Frame):
             if result == wx.ID_CANCEL:
                 return "abort"      # stop this function - do nothing.
             elif result == wx.ID_YES:
-                self.OnSaveSession()
+                filename = self.OnSaveSession()
+                if filename is None:
+                    return "abort"
             elif result == wx.ID_NO:
                 pass
         # Delete all the pages
@@ -547,6 +578,7 @@ class MyFrame(wx.Frame):
             menu=self.modelmenudict["User"]
             for item in  menu.GetMenuItems():
                 menu.RemoveItem(item)
+        return "clear"
 
 
     def OnCommSession(self,e=None):
@@ -601,6 +633,10 @@ class MyFrame(wx.Frame):
         
 
     def OnExit(self,e=None):
+        """
+            Kindly ask the user whether to save the session, then
+            exit the program.
+        """
         numtabs = self.notebook.GetPageCount()
         # Ask, if user wants to save current session.
         if numtabs > 0:
@@ -613,7 +649,11 @@ class MyFrame(wx.Frame):
             if result == wx.ID_CANCEL:
                 return # stop this function - do nothing.
             elif result == wx.ID_YES:
-                self.OnSaveSession()
+                filename = self.OnSaveSession()
+                if filename is None:
+                    # Do not exit. The user pressed abort in the session
+                    # saving dialog.
+                    return
         # Exit the Program
         self.Destroy()
 
@@ -626,24 +666,31 @@ class MyFrame(wx.Frame):
 
 
 
-    def OnFNBPageChanged(self,e=None, Page=None):
+    def OnFNBPageChanged(self, e=None, Page=None, trigger=None):
         """ Called, when 
             - Page focus switches to another Page
             - Page with focus changes significantly:
                 - experimental data is loaded
                 - weighted fit was done
+            - trigger is a string. For more information, read the
+              docstrings of the `tools` submodule.
         """
+        
+        if (e is not None and
+            e.GetEventType() == fnb.EVT_FLATNOTEBOOK_PAGE_CHANGED.typeId
+            and trigger is None):
+            trigger = "tab_browse"
         # Get the Page
         if Page is None:
             Page = self.notebook.GetCurrentPage()
         keys = self.ToolsOpen.keys()
         for key in keys:
             # Update the information
-            self.ToolsOpen[key].OnPageChanged(Page)
+            self.ToolsOpen[key].OnPageChanged(Page, trigger=trigger)
         # parameter range selection tool for page.
         if self.RangeSelector is not None:
             try:
-                self.RangeSelector.OnPageChanged(Page)
+                self.RangeSelector.OnPageChanged(Page, trigger=trigger)
             except:
                 pass
         # Bugfix-workaround for mac:
@@ -819,7 +866,13 @@ class MyFrame(wx.Frame):
 
 
     def ImportData(self, Page, dataexp, trace, curvetype="",
-                   filename="", curveid="", run=""):
+                   filename="", curveid="", run="", trigger=None):
+        """
+            Import data into the current page.
+            
+            `trigger` is passed to PlotAll. For more info see the
+            submodule `tools`.
+        """
         CurPage = Page
         # Import traces. Traces are usually put into a list, even if there
         # is only one trace. The reason is, that for cross correlation, we 
@@ -830,11 +883,11 @@ class MyFrame(wx.Frame):
         # doing this in order to keep data types clean.
         if curvetype[0:2] == "CC":
             # For cross correlation, the trace has two components
-            CurPage.IsCrossCorrelation = True
+            CurPage.SetCorrelationType(True, init=True)
             CurPage.tracecc = trace
             CurPage.trace = None
         else:
-            CurPage.IsCrossCorrelation = False
+            CurPage.SetCorrelationType(False, init=True)
             CurPage.tracecc = None
             if trace is not None:
                 CurPage.trace = trace
@@ -848,7 +901,7 @@ class MyFrame(wx.Frame):
         # It might be possible, that we want the channels to be
         # fixed to some interval. This is the case if the 
         # checkbox on the "Channel selection" dialog is checked.
-        self.OnFNBPageChanged()
+        #self.OnFNBPageChanged()
         # Enable Fitting Button
         CurPage.Fit_enable_fitting()
         # Set new tabtitle value and strip leading or trailing
@@ -859,11 +912,11 @@ class MyFrame(wx.Frame):
             title = "{} id{:03d}-{}".format(filename, int(curveid), curvetype)
         CurPage.tabtitle.SetValue(title.strip())
         # Plot everything
-        CurPage.PlotAll()
+        CurPage.PlotAll(trigger=trigger)
         # Call this function to allow the "Channel Selection" window that
         # might be open to update itself.
         # We are aware of the fact, that we just did that
-        self.OnFNBPageChanged()
+        #self.OnFNBPageChanged()
 
 
     def OnLatexCheck(self,e):
@@ -897,7 +950,7 @@ class MyFrame(wx.Frame):
                         "  - latex\n"+
                         "  - dvipng\n"+
                         "  - ghostscript\n"+
-                        "  - texlive-science\n"+
+                        "  - texlive-latex-base\n"+
                         "  - texlive-math-extra\n")
             dlg = wx.MessageDialog(None, text, 'Latex not found', 
                             wx.OK | wx.ICON_EXCLAMATION)
@@ -1109,13 +1162,14 @@ class MyFrame(wx.Frame):
             # Fill Page with data
             self.ImportData(CurPage, Correlation[i], Trace[i],
                             curvetype=Type[i], filename=Filename[i],
-                            curveid=str(Curveid[i]), run=str(Run[i]))
+                            curveid=str(Curveid[i]), run=str(Run[i]),
+                            trigger="page_add_batch")
             # Let the user abort, if he wants to:
             # We want to do this here before an empty page is added
             # to the notebook.
             if dlg.Update(i+1, "Loading pages...")[0] == False:
-                dlg.Destroy()
-                return
+                break
+        self.OnFNBPageChanged(trigger="page_add_finalize")
         # If the user did not select curves but chose a model, destroy
         # the dialog.
         dlg.Destroy()
@@ -1130,16 +1184,13 @@ class MyFrame(wx.Frame):
         # This will also ask, if user wants to save the current session.
         clear = self.OnClearSession(clearmodels=True)
         if clear == "abort":
-            # User pressed abort when he was asked if he wants to save the
-            # session.
+            # User pressed abort when he was asked if he wants to save
+            # the session. Therefore, we cannot open a new session.
             return "abort"
         Infodict, self.dirname, filename = \
          opf.OpenSession(self, self.dirname, sessionfile=sessionfile)
         # Check, if a file has been opened
         if filename is not None:
-            # Reset all Pages. We already gave the user the possibility to
-            # save his session.
-            # self.OnClearSession()
             self.filename = filename
             self.SetTitleFCS(self.filename)
             ## Background traces
@@ -1178,9 +1229,8 @@ class MyFrame(wx.Frame):
                 # the page later.
                 counter = Infodict["Parameters"][i][0]
                 modelid = Infodict["Parameters"][i][1]
-                self.add_fitting_tab(modelid=modelid, counter=counter)
-                # Get New Page, so we can add our stuff.
-                Newtab = self.notebook.GetCurrentPage()
+                Newtab = self.add_fitting_tab(modelid=modelid,
+                                              counter=counter)
                 # Add experimental Data
                 # Import dataexp:
                 number = counter.strip().strip(":").strip("#")
@@ -1205,7 +1255,8 @@ class MyFrame(wx.Frame):
                     for wkey in wkeys:
                         WeightKinds += [wkey]
                     Newtab.Fitbox[1].SetItems(WeightKinds)
-                self.UnpackParameters(Infodict["Parameters"][i], Newtab)
+                self.UnpackParameters(Infodict["Parameters"][i], Newtab,
+                                      init=True)
                 # Supplementary data
                 try:
                     Sups = Infodict["Supplements"][pageid]
@@ -1244,7 +1295,7 @@ class MyFrame(wx.Frame):
                     else:
                         Newtab.tracecc = trace
                 # Plot everything
-                Newtab.PlotAll()
+                Newtab.PlotAll(trigger="page_add_batch")
             # Set Session Comment
             try:
                 self.SessionComment = Infodict["Comments"]["Session"]
@@ -1257,7 +1308,7 @@ class MyFrame(wx.Frame):
             if self.notebook.GetPageCount() > 0:
                 # Enable the "Current" Menu
                 self.EnableToolCurrent(True)
-                self.OnFNBPageChanged()
+                self.OnFNBPageChanged(trigger="page_add_finalize")
             else:
                 # There are no pages in the session.
                 # Disable some menus and close some dialogs
@@ -1295,7 +1346,13 @@ class MyFrame(wx.Frame):
 
 
     def OnSaveSession(self,e=None):
-        """Save a session for later continuation."""
+        """ 
+            Save a session to a session file.
+        
+            Returns:
+            - the filename of the session if it was saved
+            - None, if the user canceled the action
+        """
         # Parameters are all in one dictionary:
         Infodict = dict()
         Infodict["Backgrounds"] = self.Background # Background list
@@ -1362,10 +1419,9 @@ class MyFrame(wx.Frame):
         # If no file has been selected, self.filename will be set to 'None'.
         self.dirname, self.filename = opf.SaveSession(self, self.dirname,
           Infodict)
-        #Function_parms, Function_array, Function_trace, self.Background,
-        #Preferences, Comments, ExternalFunctions, Info)
         # Set title of our window
         self.SetTitleFCS(self.filename)
+        return self.filename
 
 
     def OnShell(self, e=None):
@@ -1427,8 +1483,8 @@ class MyFrame(wx.Frame):
         # Additional parameters as of v.0.2.0
         # Splines and model function:
         # Additional parameters as of v.6.4.0
-        #self.Fitbox=[ fitbox, weightedfitdrop, fittext, fittext2, fittextvar,
-        #                fitspin, buttonfit ]
+        #self.Fitbox=[ fitbox, weightedfitdrop, fittext, fittext2, 
+        #              fittextvar, fitspin, buttonfit ]
         # Some fits like Spline have a number of knots of the spline
         # that is important for fitting. If there is a number in the
         # Dropdown, save it.
@@ -1441,7 +1497,8 @@ class MyFrame(wx.Frame):
             knots = int(knots)
         weighted = Page.weighted_fittype_id
         weights = Page.weighted_nuvar
-        Parms.append([weighted, weights, knots])
+        algorithm = Page.fit_algorithm
+        Parms.append([weighted, weights, knots, algorithm])
         # Additional parameters as of v.0.2.9
         # Which Background signal is selected?
         # The Background information is in the list *self.Background*.
@@ -1461,11 +1518,14 @@ class MyFrame(wx.Frame):
         return Parms
 
 
-    def UnpackParameters(self, Parms, Page):
+    def UnpackParameters(self, Parms, Page, init=False):
         """ Apply the given parameters to the Page in question.
             This function contains several *len(Parms) >= X* statements.
             These are used for opening sessions that were saved using
             earlier versions of PyCorrFit.
+            The `init` flag should be set when fundamental changes
+            are made, e.g. when loading data, which might change the
+            type (autocorrelation/cross-correlation) of the page.
         """
         modelid = Parms[1]
         if Page.modelid != modelid:
@@ -1508,9 +1568,13 @@ class MyFrame(wx.Frame):
             if len(Parms[5]) == 2:
                 [weighted, weights] = Parms[5]
                 knots = None
-            else:
+            elif len(Parms[5]) == 3:
                 # We have knots as of v. 0.6.5
                 [weighted, weights, knots] = Parms[5]
+            else:
+                # We have different fitting algorithms as of v. 0.8.3
+                [weighted, weights, knots, algorithm] = Parms[5]
+                Page.fit_algorithm = algorithm
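+                # Parms[5] layouts by session version:
+                #     < 0.6.5: [weighted, weights]
+                #    >= 0.6.5: [weighted, weights, knots]
+                #    >= 0.8.3: [weighted, weights, knots, algorithm]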
             if knots is not None:
                 # This is done with apply_paramters_reverse:
                 #       text = Page.Fitbox[1].GetValue()
@@ -1549,7 +1613,7 @@ class MyFrame(wx.Frame):
                 Page.OnAmplitudeCheck("init")
         # Set if Newtab is of type cross-correlation:
         if len(Parms) >= 8:
-            Page.IsCrossCorrelation = Parms[7]
+            Page.SetCorrelationType(Parms[7], init)
         if len(Parms) >= 9:
             # New feature in 0.7.8 includes normalization to a fitting
             # parameter.
diff --git a/src/models/MODEL_TIRF_gaussian_1C.py b/src/models/MODEL_TIRF_gaussian_1C.py
index 3cba3cb..8352f26 100755
--- a/src/models/MODEL_TIRF_gaussian_1C.py
+++ b/src/models/MODEL_TIRF_gaussian_1C.py
@@ -56,9 +56,8 @@ def CF_Gxyz_TIR_gauss(parms, tau):
         [2] d_eva  Evanescent field depth
         [3] C_3D   Particle concentration in the confocal volume
         *tau* - lag time
-
-        Returns: Normalized 3D correlation function for TIRF.
     """
+    # model 6013
     D = parms[0]
     r0 = parms[1]
     deva = parms[2]
@@ -90,6 +89,72 @@ def CF_Gxyz_TIR_gauss(parms, tau):
     # 1 / (Conc * deva * np.pi * r0) * gz * g2D
 
     return 1 / (Neff) * g2D * gz
+    
+
+
+def CF_Gxyz_TIR_gauss_trip(parms, tau):
+    u""" Three-dimensional free diffusion with a Gaussian lateral 
+        detection profile and an exponentially decaying profile
+        in axial direction, including a triplet component.
+
+        x = sqrt(D*τ)*κ
+        κ = 1/d_eva
+        w(i*x) = exp(x²)*erfc(x)
+        gz = κ * [ sqrt(D*τ/π) + (1 - 2*D*τ*κ²)/(2*κ) * w(i*x) ]
+        g2D = 1 / [ π (r₀² + 4*D*τ) ]
+        triplet = 1 + T/(1-T)*exp(-τ/τ_trip)
+        G = 1/C_3D * g2D * gz * triplet
+
+
+        *parms* - a list of parameters.
+        Parameters (parms[i]):
+        [0] D      Diffusion coefficient
+        [1] r₀     Lateral extent of the detection volume
+        [2] d_eva  Evanescent field depth
+        [3] C_3D   Particle concentration in the confocal volume
+        [4] τ_trip  Characteristic residence time in triplet state
+        [5] T       Fraction of particles in triplet (non-fluorescent) state
+                    0 <= T < 1
+        *tau* - lag time
+    """
+    # model 6014
+    D = parms[0]
+    r0 = parms[1]
+    deva = parms[2]
+    Conc = parms[3]
+    tautrip = parms[4]
+    T = parms[5]
+
+    # Calculate sigma: width of the gaussian approximation of the PSF
+    Veff = np.pi * r0**2 * deva
+    Neff = Conc * Veff
+
+    taudiff = r0**2/(4*D)
+    # 2D gauss component
+    # G2D = 1/N2D * g2D = 1/(Aeff*Conc.2D) * g2D
+    g2D = 1 / ( (1.+tau/taudiff) )
+
+    # 1d TIR component
+    # Axial correlation    
+    kappa = 1/deva
+    x = np.sqrt(D*tau)*kappa
+    w_ix = wixi(x)
+
+    # Gz = 1/N1D * gz = kappa / Conc.1D * gz
+    gz = kappa * (np.sqrt(D*tau/np.pi) - (2*D*tau*kappa**2 - 1)/(2*kappa) * w_ix)
+
+    ### triplet
+    if tautrip == 0 or T == 0:
+        triplet = 1
+    else:
+        triplet = 1 + T/(1-T) * np.exp(-tau/tautrip)
+
+    # Neff is not the actual particle number. This formula just looks nicer
+    # this way.
+    # What would be easier to get is:
+    # 1 / (Conc * deva * np.pi * r0) * gz * g2D
+
+    return 1 / (Neff) * g2D * gz * triplet
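+
+
+# A small evaluation sketch (not used by the GUI): the triplet model
+# above, called with the default parameter set of model 6014.
+def example_eval_6014():
+    parms = [2.5420, 9.44, 1.0, 0.03011, 0.001, 0.01]
+    # lag times in ms, strictly positive
+    tau = 10**np.linspace(-6, 2, 10)
+    return CF_Gxyz_TIR_gauss_trip(parms, tau)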
 
 
 
@@ -123,19 +188,60 @@ def MoreInfo_6013(parms, countrate):
     return Info
 
 
+def MoreInfo_6014(parms, countrate):
+    u"""Supplementary variables:
+        Beware that the effective volume is chosen arbitrarily.
+        Correlation function at lag time τ=0:
+        [6] G(τ=0)
+        Effective detection volume:         
+        [7] V_eff  = π * r₀² * d_eva
+        Effective particle concentration:
+        [8] C_3D [nM] = C_3D [1000/µm³] * 10000/6.0221415
+    """
+    D = parms[0]
+    r0 = parms[1]
+    deva = parms[2]
+    Conc = parms[3]
+    Info=list()
+    # Detection area:
+    Veff = np.pi * r0**2 * deva
+    Neff = Conc * Veff
+    # Correlation function at tau = 0
+    G_0 = CF_Gxyz_TIR_gauss_trip(parms, 0)
+    Info.append(["G(0)", G_0])
+    Info.append(["V_eff [al]", Veff])
+    Info.append(["C_3D [nM]", Conc * 10000/6.0221415])
+    if countrate is not None:
+        # CPP
+        cpp = countrate/Neff
+        Info.append(["cpp [kHz]", cpp])
+    return Info
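+# The returned Info is a list of [label, value] pairs, e.g. (with
+# purely illustrative numbers):
+#     [["G(0)", 33.2], ["V_eff [al]", 280.0], ["C_3D [nM]", 50.0]]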
+
+
 # 3D Model TIR gaussian
-m_3dtirsq6013 = [6013, "3D","Simple 3D diffusion w/ TIR", CF_Gxyz_TIR_gauss]
-labels_6013 = ["D [10 µm²/s]",u"r₀ [100 nm]", "d_eva [100 nm]", "C_3D [1000/µm³)"]
-values_6013 = [2.5420, 9.44, 1.0, .03011]
+m_3dtirsq6013 = [6013, "3D","Simple 3D diffusion w/ TIR",
+                 CF_Gxyz_TIR_gauss]
+labels_6013 = [u"D [10 µm²/s]",
+               u"r₀ [100 nm]",
+               u"d_eva [100 nm]",
+               u"C_3D [1000/µm³)"]
+values_6013 = [2.5420,
+               9.44,
+               1.0,
+               0.03011]
 # For user comfort we add values that are human readable.
 # These will be used for output that only humans can read.
-labels_human_readable_6013 = ["D [µm²/s]", u"r₀ [nm]", "d_eva [nm]", "C_3D [1/µm³]"]
-values_factor_human_readable_6013 = [10, 100, 100, 1000]
+labels_human_readable_6013 = [u"D [µm²/s]",
+                              u"r₀ [nm]",
+                              u"d_eva [nm]",
+                              u"C_3D [1/µm³]"]
+values_factor_human_readable_6013 = [10, 
+                                     100,
+                                     100,
+                                     1000]
 valuestofit_6013 = [True, False, False, True]
 parms_6013 = [labels_6013, values_6013, valuestofit_6013,
-              labels_human_readable_6013, values_factor_human_readable_6013]
-
-
+              labels_human_readable_6013, values_factor_human_readable_6013]
 
 # Pack the models
 model1 = dict()
@@ -145,4 +251,43 @@ model1["Supplements"] = MoreInfo_6013
 model1["Verification"] = lambda parms: np.abs(parms)
 
 
-Modelarray = [model1]
+# 3D Model TIR gaussian + triplet
+m_3dtirsq6014 = [6014, "T+3D","Simple 3D diffusion + triplet w/ TIR",
+                 CF_Gxyz_TIR_gauss_trip]
+labels_6014 = [u"D [10 µm²/s]",
+               u"r₀ [100 nm]",
+               u"d_eva [100 nm]",
+               u"C_3D [1000/µm³)",
+               u"τ_trip [ms]",
+               u"T"]
+values_6014 = [2.5420,
+               9.44,
+               1.0,
+               0.03011,
+               0.001,
+               0.01]
+labels_human_readable_6014 = [u"D [µm²/s]",
+                              u"r₀ [nm]",
+                              u"d_eva [nm]",
+                              u"C_3D [1/µm³]",
+                              u"τ_trip [µs]",
+                              u"T"]
+values_factor_human_readable_6014 = [10,
+                                     100,
+                                     100,
+                                     1000,
+                                     1000,
+                                     1]
+valuestofit_6014 = [True, False, False, True, False, False]
+parms_6014 = [labels_6014, values_6014, valuestofit_6014,
+              labels_human_readable_6014, values_factor_human_readable_6014]
+
+# Pack the models
+model2 = dict()
+model2["Parameters"] = parms_6014
+model2["Definitions"] = m_3dtirsq6014
+model2["Supplements"] = MoreInfo_6014
+model2["Verification"] = lambda parms: np.abs(parms)
+
+
+Modelarray = [model1, model2]
diff --git a/src/models/MODEL_TIRF_gaussian_3D2D.py b/src/models/MODEL_TIRF_gaussian_3D2D.py
index fa02470..3c31a7d 100755
--- a/src/models/MODEL_TIRF_gaussian_3D2D.py
+++ b/src/models/MODEL_TIRF_gaussian_3D2D.py
@@ -18,7 +18,9 @@
     You should have received a copy of the GNU General Public License 
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
-import numpy as np                  # NumPy
+from __future__ import division
+
+import numpy as np
 import scipy.special as sps
 
 
@@ -107,7 +109,10 @@ def CF_Gxyz_3d2dT_gauss(parms, tau):
     particle3D = alpha**2*F * g2D3D * gz
 
     ### triplet
-    triplet = 1 + T/(1-T)*np.exp(-tau/tautrip)
+    if tautrip == 0 or T == 0:
+        triplet = 1
+    else:
+        triplet = 1 + T/(1-T) * np.exp(-tau/tautrip)
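+    # This guard avoids NaNs and numpy warnings in exp(-tau/tautrip)
+    # when the triplet fraction or the triplet time is zero; the same
+    # pattern appears in all triplet-carrying models.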
 
     ### Norm
     norm = (1-F + alpha*F)**2
diff --git a/src/models/MODEL_TIRF_gaussian_3D3D.py b/src/models/MODEL_TIRF_gaussian_3D3D.py
index c4f990f..0461635 100755
--- a/src/models/MODEL_TIRF_gaussian_3D3D.py
+++ b/src/models/MODEL_TIRF_gaussian_3D3D.py
@@ -18,7 +18,9 @@
     You should have received a copy of the GNU General Public License 
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
-import numpy as np                  # NumPy
+from __future__ import division
+
+import numpy as np
 import scipy.special as sps
 
 
@@ -124,7 +126,10 @@ def CF_Gxyz_3D3DT_gauss(parms, tau):
     particle2 = alpha**2*(1-F) * g2D2 * gz2
 
     ### triplet
-    triplet = 1 + T/(1-T)*np.exp(-tau/tautrip)
+    if tautrip == 0 or T == 0:
+        triplet = 1
+    else:
+        triplet = 1 + T/(1-T) * np.exp(-tau/tautrip)
 
     ### Norm
     norm = (F + alpha*(1-F))**2
@@ -230,7 +235,7 @@ values = [
                 25,      # n
                 25.,       # D1
                 0.9,    # D2
-                0.45,     # F1
+                0.5,     # F1
                 9.44,       # r0
                 1.0,       # deva
                 1.0,     # alpha
diff --git a/src/models/MODEL_classic_gaussian_2D.py b/src/models/MODEL_classic_gaussian_2D.py
index 8a7d51d..cab7c77 100755
--- a/src/models/MODEL_classic_gaussian_2D.py
+++ b/src/models/MODEL_classic_gaussian_2D.py
@@ -18,8 +18,9 @@
     You should have received a copy of the GNU General Public License 
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
+from __future__ import division
 
-import numpy as np                  # NumPy
+import numpy as np
 
 
 # 2D simple gauss
@@ -87,7 +88,10 @@ def CF_Gxy_T_gauss(parms, tau):
     T = parms[3]
     dc = parms[4]
 
-    triplet = 1 + T/(1-T)*np.exp(-tau/tautrip)
+    if tautrip == 0 or T == 0:
+        triplet = 1
+    else:
+        triplet = 1 + T/(1-T) * np.exp(-tau/tautrip)
 
     BB = 1 / ( (1.+tau/taudiff) )
     G = dc + 1/n * BB * triplet
@@ -253,10 +257,10 @@ values_6031 = [
                 25,      # n
                 5,       # taud1
                 1000,    # taud2
-                0.75,     # F
+                0.5,     # F
                 1.0,     # alpha
-                0.001,       # tautrip
-                0.01,       # T
+                0.001,   # tautrip
+                0.01,    # T
                 0.0      # offset
                 ]        
 # For user comfort we add values that are human readable.
diff --git a/src/models/MODEL_classic_gaussian_3D.py b/src/models/MODEL_classic_gaussian_3D.py
index a09278c..4d2b981 100755
--- a/src/models/MODEL_classic_gaussian_3D.py
+++ b/src/models/MODEL_classic_gaussian_3D.py
@@ -18,7 +18,9 @@
     You should have received a copy of the GNU General Public License 
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
-import numpy as np                  # NumPy
+from __future__ import division
+
+import numpy as np
 
 # 3D simple gauss
 def CF_Gxyz_gauss(parms, tau):
@@ -92,10 +94,12 @@ def CF_Gxyz_blink(parms, tau):
     SP = parms[4]
     off = parms[5]
 
+    if tautrip == 0 or T == 0:
+        AA = 1
+    else:
+        AA = 1 + T/(1-T) * np.exp(-tau/tautrip)
 
-    AA = 1. + T/(1.-T) * np.exp(-tau/tautrip)
-
-    BB = 1 / ( (1.+tau/taudiff) * np.sqrt(1.+tau/(SP**2*taudiff)) )
+    BB = 1 / ( (1+tau/taudiff) * np.sqrt(1+tau/(SP**2*taudiff)) )
     G = off + 1/n * AA * BB
     return G
 
@@ -161,9 +165,11 @@ def CF_Gxyz_gauss_3D3DT(parms, tau):
 
     particle1 = F/( (1+tau/taud1) * np.sqrt(1+tau/(taud1*SP**2)))
     particle2 = alpha**2*(1-F)/( (1+tau/taud2) * np.sqrt(1+tau/(taud2*SP**2)))
-    # If the fraction of dark molecules is zero, its ok. Python can also handle
-    # exp(-1/inf).
-    triplet = 1 + T/(1-T)*np.exp(-tau/tautrip)
+    # Avoid NaNs and warnings in exp(-tau/tautrip) if the dark
+    # fraction or the triplet time is zero.
+    if tautrip == 0 or T == 0:
+        triplet = 1
+    else:
+        triplet = 1 + T/(1-T) * np.exp(-tau/tautrip)
     # For alpha == 1, *norm* becomes one
     norm = (F + alpha*(1-F))**2
 
@@ -261,7 +267,7 @@ values_6030 = [
                 25,      # n
                 5,       # taud1
                 1000,    # taud2
-                0.75,    # F
+                0.5,     # F
                 5,       # SP
                 1.0,     # alpha
                 0.001,   # tautrip
diff --git a/src/models/MODEL_classic_gaussian_3D2D.py b/src/models/MODEL_classic_gaussian_3D2D.py
index 83b50a4..b50a0d9 100755
--- a/src/models/MODEL_classic_gaussian_3D2D.py
+++ b/src/models/MODEL_classic_gaussian_3D2D.py
@@ -18,7 +18,9 @@
     You should have received a copy of the GNU General Public License 
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
-import numpy as np                  # NumPy
+from __future__ import division
+
+import numpy as np
 
 # 3D + 2D + T
 def CF_Gxyz_3d2dT_gauss(parms, tau):
@@ -65,7 +67,10 @@ def CF_Gxyz_3d2dT_gauss(parms, tau):
 
     particle2D = (1-F)/ (1+tau/taud2D) 
     particle3D = alpha**2*F/( (1+tau/taud3D) * np.sqrt(1+tau/(taud3D*SP**2)))
-    triplet = 1 + T/(1-T)*np.exp(-tau/tautrip)
+    if tautrip == 0 or T == 0:
+        triplet = 1
+    else:
+        triplet = 1 + T/(1-T) * np.exp(-tau/tautrip)
     norm = (1-F + alpha*F)**2
     G = 1/n*(particle2D + particle3D)*triplet/norm
 
@@ -131,13 +136,13 @@ labels  = ["n",
                 ]
 values = [ 
                 25,      # n
-                100,       # taud2D
-                0.1,    # taud3D
-                0.45,     # F3D
+                100,     # taud2D
+                0.1,     # taud3D
+                0.5,     # F3D
                 7,       # SP
                 1.0,     # alpha
-                0.001,       # tautrip
-                0.01,       # T
+                0.001,   # tautrip
+                0.01,    # T
                 0.0      # offset
                 ]
 # For user comfort we add values that are human readable.
diff --git a/src/models/__init__.py b/src/models/__init__.py
index 2c3aecd..5be89bf 100644
--- a/src/models/__init__.py
+++ b/src/models/__init__.py
@@ -333,25 +333,35 @@ verification = dict()
 
 
 # 6001 6002 6031
-AppendNewModel(MODEL_classic_gaussian_2D.Modelarray) 
+AppendNewModel(MODEL_classic_gaussian_2D.Modelarray)
+
 # 6011 6012 6030
-AppendNewModel(MODEL_classic_gaussian_3D.Modelarray) 
+AppendNewModel(MODEL_classic_gaussian_3D.Modelarray)
+
 # 6032
-AppendNewModel(MODEL_classic_gaussian_3D2D.Modelarray) 
-# 6013
+AppendNewModel(MODEL_classic_gaussian_3D2D.Modelarray)
+
+# 6013 6014
 AppendNewModel(MODEL_TIRF_gaussian_1C.Modelarray)
+
 # 6033
-AppendNewModel(MODEL_TIRF_gaussian_3D2D.Modelarray) 
+AppendNewModel(MODEL_TIRF_gaussian_3D2D.Modelarray)
+
 # 6034
-AppendNewModel(MODEL_TIRF_gaussian_3D3D.Modelarray) 
+AppendNewModel(MODEL_TIRF_gaussian_3D3D.Modelarray)
+
 # 6000 6010
-AppendNewModel(MODEL_TIRF_1C.Modelarray) 
+AppendNewModel(MODEL_TIRF_1C.Modelarray)
+
 # 6022
-AppendNewModel(MODEL_TIRF_2D2D.Modelarray) 
+AppendNewModel(MODEL_TIRF_2D2D.Modelarray)
+
 # 6020
-AppendNewModel(MODEL_TIRF_3D2D.Modelarray) 
+AppendNewModel(MODEL_TIRF_3D2D.Modelarray)
+
 # 6023
-AppendNewModel(MODEL_TIRF_3D3D.Modelarray) 
+AppendNewModel(MODEL_TIRF_3D3D.Modelarray)
+
 # 6021
 AppendNewModel(MODEL_TIRF_3D2Dkin_Ries.Modelarray) 
 
@@ -365,7 +375,7 @@ modeltypes = dict()
 #modeltypes[u"TIR (□xσ/Exp.)"] = [6000, 6010, 6022, 6020, 6023, 6021]
 
 modeltypes[u"Confocal (Gaussian)"] = [6011, 6030, 6002, 6031, 6032]
-modeltypes[u"TIR (Gaussian/Exp.)"] = [6013, 6034, 6033]
+modeltypes[u"TIR (Gaussian/Exp.)"] = [6014, 6034, 6033]
 modeltypes[u"TIR (□xσ/Exp.)"] = [6010, 6023, 6000, 6022, 6020, 6021]
 modeltypes[u"User"] = list()
 
diff --git a/src/openfile.py b/src/openfile.py
index 96c3b9d..2de0000 100644
--- a/src/openfile.py
+++ b/src/openfile.py
@@ -51,8 +51,10 @@ def ImportParametersYaml(parent, dirname):
     """ Import the parameters from a parameters.yaml file
         from a PyCorrFit session.
     """
+    wc = [".pcfs", ".fcsfit-session.zip"]
+    wcstring = "PyCorrFit session (*.pcfs)|*{};*{}".format(wc[0], wc[1])
     dlg = wx.FileDialog(parent, "Open session file", dirname, "", 
-                                "*.fcsfit-session.zip", wx.OPEN)
+                                 wcstring, wx.OPEN)
     # user cannot do anything until he clicks "OK"
     if dlg.ShowModal() == wx.ID_OK:
         path = dlg.GetPath()            # Workaround since 0.7.5
@@ -88,10 +90,11 @@ def OpenSession(parent, dirname, sessionfile=None):
         "Traces", dict: page numbers, all traces of the pages
     """
     Infodict = dict()
-    fcsfitwildcard = ".fcsfit-session.zip"
+    wc = [".pcfs", ".fcsfit-session.zip"]
+    wcstring = "PyCorrFit session (*.pcfs)|*{};*{}".format(wc[0], wc[1])
     if sessionfile is None:
         dlg = wx.FileDialog(parent, "Open session file", dirname, "", 
-                        "*"+fcsfitwildcard, wx.OPEN)
+                        wcstring, wx.OPEN)
         # user cannot do anything until he clicks "OK"
         if dlg.ShowModal() == wx.ID_OK:
             path = dlg.GetPath()            # Workaround since 0.7.5
@@ -108,7 +111,8 @@ def OpenSession(parent, dirname, sessionfile=None):
     else:
         (dirname, filename) = os.path.split(sessionfile)
         path = sessionfile                  # Workaround since 0.7.5
-        if filename[-19:] != fcsfitwildcard:
+        if (filename[-len(wc[0]):] != wc[0] and
+            filename[-len(wc[1]):] != wc[1]):
             # User specified wrong file
             print "Unknown file extension: "+filename
             # stop this function
@@ -354,15 +358,16 @@ def SaveSession(parent, dirname, Infodict):
         We will also write a Readme.txt
     """
     dlg = wx.FileDialog(parent, "Save session file", dirname, "",
-                     "*.fcsfit-session.zip", wx.SAVE|wx.FD_OVERWRITE_PROMPT)
+                        "PyCorrFit session (*.pcfs)|*.pcfs",
+                        wx.SAVE|wx.FD_OVERWRITE_PROMPT)
     if dlg.ShowModal() == wx.ID_OK:
         path = dlg.GetPath()            # Workaround since 0.7.5
         (dirname, filename) = os.path.split(path)
         #filename = dlg.GetFilename()
         #dirname = dlg.GetDirectory()
         # Sometimes you have multiple endings...
-        if filename.endswith(".fcsfit-session.zip") is not True:
-            filename = filename+".fcsfit-session.zip"
+        if filename.endswith(".pcfs") is not True:
+            filename = filename+".pcfs"
         dlg.Destroy()
         # Change working directory
         returnWD = os.getcwd()
@@ -800,7 +805,9 @@ Parameters.yaml
       - (List of checked parameters (for fitting))
       - [(Min channel selected), (Max channel selected)]
       - [(Weighted fit method (0=None, 1=Spline, 2=Model function)), 
-          (No. of bins from left and right(, (No. of knots (of e.g. spline))]
+         (No. of bins from left and right),
+         (No. of knots (of e.g. spline)),
+         (Type of fitting algorithm (e.g. "Lev-Mar", "Nelder-Mead"))]
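+          e.g. [1, 5, 5, "Lev-Mar"] for a spline-weighted
+          Levenberg-Marquardt fit (illustrative values)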
       - [B1,B2] Background to use (line in backgrounds.csv)
            B2 is always *null* for autocorrelation curves
       - Data type is Cross-correlation?
diff --git a/src/page.py b/src/page.py
index 5bb9e7d..afc215b 100644
--- a/src/page.py
+++ b/src/page.py
@@ -28,11 +28,6 @@
     You should have received a copy of the GNU General Public License 
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
-# Use DEMO for contrast-rich screenshots.
-# This enlarges axis text and draws black lines instead of grey ones.
-DEMO = False
-
-
 import wx                               # GUI interface wxPython
 from wx.lib.agw import floatspin        # Float numbers in spin fields
 import wx.lib.plot as plot              # Plotting in wxPython
@@ -41,7 +36,7 @@ import numpy as np                      # NumPy
 import sys                              # System stuff
 
 import edclasses                    # Cool stuff like better floatspin
-import leastsquaresfit as fit       # For fitting
+import fitting as fit       # For fitting
 import models as mdls
 import tools
 
@@ -87,8 +82,9 @@ class FittingPanel(wx.Panel):
         self.resid = None        # Residuals
         self.data4weight = None  # Data used for weight calculation 
         # Fitting:
-        #self.Fitbox=[ fitbox, weightedfitdrop, fittext, fittext2, fittextvar,
-        #                fitspin, buttonfit ]
+        #self.Fitbox = [ fitbox, weightedfitdrop, fittext, fittext2,
+        #                fittextvar, fitspin, buttonfit, textalg,
+        #                self.AlgorithmDropdown]
         # chi squared - is also an indicator, if something had been fitted
         self.FitKnots = 5 # number of knots for spline fit or similar
         self.chi2 = None
@@ -96,9 +92,10 @@ class FittingPanel(wx.Panel):
         self.weights_used_for_fitting = None # weights used for fitting
         self.weights_used_for_plotting = None # weights used for plotting
         self.weights_plot_fill_area = None # weight area in plot
-        self.weighted_fittype_id = None # integer (drop down item)
+        self.weighted_fittype_id = 0 # integer (drop down item)
         self.weighted_fittype = "Unknown" # type of fit used
-        self.weighted_nuvar = None # bins for std-dev. (left and rigth)
+        self.weighted_nuvar = 3 # bins for std-dev. (left and right)
+        self.fit_algorithm = "Lev-Mar" # least-squares min. is the default
         # dictionary for alternative variances from e.g. averaging
         self.external_std_weights = dict()
         # Errors of fit dictionary
@@ -170,11 +167,13 @@ class FittingPanel(wx.Panel):
         self.calculate_corr()
         # Draw the settings section
         self.settings()
+        # Load default values
+        self.apply_parameters_reverse()
         # Upper Plot for plotting of Correlation Function
         self.canvascorr = plot.PlotCanvas(self.spcanvas)
         self.canvascorr.setLogScale((True, False))  
         self.canvascorr.SetEnableZoom(True)
-        self.PlotAll()
+        self.PlotAll(event="init", trigger="tab_init")
         self.canvascorr.SetSize((canvasx, cupsizey))
         # Lower Plot for plotting of the residuals
         self.canvaserr = plot.PlotCanvas(self.spcanvas)
@@ -185,14 +184,6 @@ class FittingPanel(wx.Panel):
                                         cupsizey)
         self.sp.SplitVertically(self.panelsettings, self.spcanvas,
                                 self.sizepanelx)
-        ## Check out the DEMO option and make change the plot:
-        try:
-            if DEMO == True:
-                self.canvascorr.SetFontSizeAxis(16)
-                self.canvaserr.SetFontSizeAxis(16)
-        except:
-            # Don't raise any unnecessary erros
-            pass
         # Bind resizing to resizing function.
         wx.EVT_SIZE(self, self.OnSize)
 
@@ -230,6 +221,10 @@ class FittingPanel(wx.Panel):
             Knots = self.Fitbox[1].GetValue()
             Knots = filter(lambda x: x.isdigit(), Knots)
             self.FitKnots = int(Knots)
+        # Fitting algorithm
+        keys, items = fit.GetAlgorithmStringList()
+        idalg = self.AlgorithmDropdown.GetSelection()
+        self.fit_algorithm = keys[idalg]
         # If parameters have been changed because of the check_parms
         # function, write them back.
         self.apply_parameters_reverse()
@@ -262,6 +257,10 @@ class FittingPanel(wx.Panel):
         List[1] = "Spline ("+str(self.FitKnots)+" knots)"
         self.Fitbox[1].SetItems(List)
         self.Fitbox[1].SetSelection(idf)
+        # Fitting algorithm
+        keys, items = fit.GetAlgorithmStringList()
+        idalg = keys.index(self.fit_algorithm)
+        self.AlgorithmDropdown.SetSelection(idalg)
 
 
     def calculate_corr(self):
@@ -350,12 +349,6 @@ class FittingPanel(wx.Panel):
                 self.startcrop = 0
             else:
                 self.tau = 1*self.taufull[self.startcrop:self.endcrop]
-        ## ## Channel selection
-        ## # Crops the array *self.dataexpfull* from *start* (int) to *end* (int)
-        ## # and assigns the result to *self.dataexp*. If *start* and *end* are 
-        ## # equal (or not given), *self.dataexp* will be equal to 
-        ## # *self.dataexpfull*.
-        ## self.parent.OnFNBPageChanged(e=None, Page=self)
 
 
     def CorrectDataexp(self, dataexp):
@@ -406,11 +399,14 @@ class FittingPanel(wx.Panel):
 
     def Fit_enable_fitting(self):
         """ Enable the fitting button and the weighted fit control"""
-        #self.Fitbox=[ fitbox, weightedfitdrop, fittext, fittext2, fittextvar,
-        #                fitspin, buttonfit ]
+        #self.Fitbox = [ fitbox, weightedfitdrop, fittext, fittext2,
+        #                fittextvar, fitspin, buttonfit, textalg,
+        #                self.AlgorithmDropdown]
         self.Fitbox[0].Enable()
         self.Fitbox[1].Enable()
-        self.Fitbox[-1].Enable()
+        self.Fitbox[6].Enable()
+        self.Fitbox[7].Enable()
+        self.Fitbox[8].Enable()
 
 
     def Fit_create_instance(self, noplots=False):
@@ -469,11 +465,23 @@ class FittingPanel(wx.Panel):
             self.weights_used_for_fitting = Fitting.dataweights
         self.weighted_fittype_id = idf = self.Fitbox[1].GetSelection()
         self.weighted_fittype = self.Fitbox[1].GetItems()[idf]
+        # Set fitting algorithm
+        Fitting.fit_algorithm = self.fit_algorithm
         return Fitting
 
         
-    def Fit_function(self, event=None, noplots=False):
-        """ Call the fit function. """
+    def Fit_function(self, event=None, noplots=False, trigger=None):
+        """ Calls the fit function.
+            
+            `noplots=True` prevents plotting of spline fits
+        
+            `trigger` is passed to page.PlotAll.
+                      If trigger is "fit_batch", then `noplots` is set
+                      to `True`.
+        
+        """
+        if trigger in ["fit_batch"]:
+            noplots = True
         # Make a busy cursor
         wx.BeginBusyCursor()
         # Apply parameters
@@ -483,8 +491,9 @@ class FittingPanel(wx.Panel):
         Fitting = self.Fit_create_instance(noplots)
         # Reset page counter
         self.GlobalParameterShare = list()
         try:
-            Fitting.least_square()
+            Fitting.minimize()
         except ValueError:
             # I sometimes had this on Windows. It is caused by fitting to
             # a .SIN file without selection proper channels first.
@@ -513,7 +522,7 @@ class FittingPanel(wx.Panel):
         # Update spin-control values
         self.apply_parameters_reverse()
         # Plot everything
-        self.PlotAll()
+        self.PlotAll(trigger=trigger)
         # Return cursor to normal
         wx.EndBusyCursor()
 
@@ -733,8 +742,6 @@ class FittingPanel(wx.Panel):
             # in the tab? We choose 9: AC1-012 plus 2 whitespaces
             text = self.counter + self.tabtitle.GetValue()[-9:]
         self.parent.notebook.SetPageText(pid,text)        
-        #import IPython
-        #IPython.embed()
 
         
     def OnSetRange(self, e):
@@ -765,7 +772,7 @@ class FittingPanel(wx.Panel):
         self.sp.SetSize(size)
 
 
-    def PlotAll(self, event=None):
+    def PlotAll(self, event=None, trigger=None):
         """
         This function plots the whole correlation and residuals canvas.
         We do:
@@ -773,6 +780,15 @@ class FittingPanel(wx.Panel):
         - Background correction
         - Apply Parameters (separate function)
         - Drawing of plots
+        
+        The `event` is usually just an event from buttons or similar
+        wx objects. It can be "init", in which case some initial
+        plotting is done before the data is handled.
+        
+        The `trigger` is passed to `self.parent.OnFNBPageChanged` so
+        that tools can update their content accordingly. For more
+        information on triggers, have a look at the docstring of the
+        `tools` submodule.
         """
         if event == "init":
             # We use this to have the page plotted at least once before
@@ -803,21 +819,10 @@ class FittingPanel(wx.Panel):
         zerostart = self.tau[0]
         zeroend = self.tau[-1]
         datazero = [[zerostart, 0], [zeroend,0]]
-        ## Check out the DEMO option and make change the plot:
-        try:
-            if DEMO == True:
-                width = 4
-                colexp = "black"
-                colfit = "red"
-            else:
-                width = 1
-                colexp = "grey"
-                colfit = "blue"
-        except:
-            # Don't raise any unnecessary erros
-            width = 1   
-            colexp = "grey"  
-            colfit = "blue"
+        # Set plot colors
+        width = 1
+        colexp = "grey"
+        colfit = "blue"
         colweight = "cyan"
         lines = list()
         linezero = plot.PolyLine(datazero, colour='orange', width=width)
@@ -917,7 +922,17 @@ class FittingPanel(wx.Panel):
             PlotCorr = plot.PlotGraphics([linezero, linecorr],
                        xLabel=u'Lag time τ [ms]', yLabel=u'G(τ)')
             self.canvascorr.Draw(PlotCorr)
-        self.parent.OnFNBPageChanged()
+        self.parent.OnFNBPageChanged(trigger=trigger)
+
+
+    def SetCorrelationType(self, iscc, init=False):
+        """
+            Set the correlation type (AC or CC) of the page. This
+            only takes effect when data is imported into the page
+            (parent.ImportData), in which case `init` is `True`.
+        """
+        if init:
+            self.IsCrossCorrelation = iscc
 
 
     def settings(self):
@@ -1043,11 +1058,20 @@ class FittingPanel(wx.Panel):
         fitsizerspin.Add(fittextvar)
         fitsizerspin.Add(fitspin)
         fitsizer.Add(fitsizerspin)
+        # Add algorithm selection
+        textalg = wx.StaticText(self.panelsettings, label="Algorithm")
+        fitsizer.Add(textalg)
+        self.AlgorithmDropdown = wx.ComboBox(self.panelsettings)
+        keys, items = fit.GetAlgorithmStringList()
+        self.AlgorithmDropdown.SetItems(items)
+        self.Bind(wx.EVT_COMBOBOX, self.apply_parameters,
+                  self.AlgorithmDropdown)
+        fitsizer.Add(self.AlgorithmDropdown)
+        self.AlgorithmDropdown.SetMaxSize(weightedfitdrop.GetSize())
         # Add button "Fit"
         buttonfit = wx.Button(self.panelsettings, label="Fit")
         self.Bind(wx.EVT_BUTTON, self.Fit_function, buttonfit)
         fitsizer.Add(buttonfit)
-        
         self.panelsettings.sizer.Add(fitsizer)
         # Squeeze everything into the sizer
         self.panelsettings.SetSizer(self.panelsettings.sizer)
@@ -1055,8 +1079,9 @@ class FittingPanel(wx.Panel):
         self.panelsettings.Layout()
         self.panelsettings.Show()
         # Make all the stuff available for everyone
-        self.Fitbox = [ fitbox, weightedfitdrop, fittext, fittext2, fittextvar,
-                        fitspin, buttonfit ]
+        self.Fitbox = [ fitbox, weightedfitdrop, fittext, fittext2,
+                        fittextvar, fitspin, buttonfit, textalg,
+                        self.AlgorithmDropdown]
         # Disable Fitting since no data has been loaded yet
         for element in self.Fitbox:
             element.Disable()
diff --git a/src/plotting.py b/src/plotting.py
index be9dc1c..79da229 100644
--- a/src/plotting.py
+++ b/src/plotting.py
@@ -87,6 +87,11 @@ def escapechars(string):
 
 def latexmath(string):
     """ Format given parameters to nice latex. """
+    if string == "offset":
+        # prevent the common label "offset" from being typeset as variables
+        return r"\mathrm{offset}"
+    elif string == "SP":
+        return r"\mathrm{SP}"
     string = codecs.decode(string, "UTF-8")
     unicodechars = dict()
     #unicodechars[codecs.decode("τ", "UTF-8")] = r"\tau"
@@ -157,19 +162,20 @@ def savePlotCorrelation(parent, dirname, Page, uselatex=False,
     labelweights = ur"Weights of fit"
     labels, parms = mdls.GetHumanReadableParms(Page.modelid,
                                                Page.active_parms[1])
-    # Error parameters with nice look
-    errparmsblank = Page.parmoptim_error
-    if errparmsblank is None:
-        errparms = None
-    else:
-        errparms = dict()
-        for key in errparmsblank.keys():
-            newkey, newparm = mdls.GetHumanReadableParameterDict(Page.modelid,
-                                                        key, errparmsblank[key])
-            errparms[newkey] = newparm
-    parmids = np.where(Page.active_parms[2])[0]
-    labels = np.array(labels)[parmids]
-    parms = np.array(parms)[parmids]
+    ## According to issue #54, we remove fitting errors from plots
+    ## Error parameters with nice look
+    #errparmsblank = Page.parmoptim_error
+    #if errparmsblank is None:
+    #    errparms = None
+    #else:
+    #    errparms = dict()
+    #    for key in errparmsblank.keys():
+    #        newkey, newparm = mdls.GetHumanReadableParameterDict(Page.modelid,
+    #                                                    key, errparmsblank[key])
+    #        errparms[newkey] = newparm
+    #parmids = np.where(Page.active_parms[2])[0]
+    #labels = np.array(labels)[parmids]
+    #parms = np.array(parms)[parmids]
     if dataexp is None:
         if tabtitle.strip() == "":
             fitlabel = Page.modelname
@@ -251,26 +257,28 @@ def savePlotCorrelation(parent, dirname, Page, uselatex=False,
         text += r'\begin{split}' # ...but they are all concatenated
         #                          by the interpreter :-)
         for i in np.arange(len(parms)):
-            text += r' '+latexmath(labels[i])+r" &= " + str(parms[i]) +r' \\ '
-        if errparms is not None:
-            keys = errparms.keys()
-            keys.sort()
-            for key in keys:
-                text += r' \Delta '+latexmath(key)+r" &= " + str(errparms[key]) +r' \\ '
+            text += r' {} &= {:.3g} \\'.format(latexmath(labels[i]), parms[i])
+        ## According to issue #54, we remove fitting errors from plots
+        #if errparms is not None:
+        #    keys = errparms.keys()
+        #    keys.sort()
+        #    for key in keys:
+        #        text += r' \Delta '+latexmath(key)+r" &= " + str(errparms[key]) +r' \\ '
         text += r' \end{split} '
         text += r' \] '
     else:
         text = ur""
         for i in np.arange(len(parms)):
-            text += labels[i]+" = "+str(parms[i])+"\n"
-        if errparms is not None:
-            keys = errparms.keys()
-            keys.sort()
-            for key in keys:
-                text += "Err "+key+" = " + str(errparms[key]) +"\n"
-    # Add some more stuff to the text and append data to a .txt file
-    #text = Auswert(parmname, parmoptim, text, savename)
-    plt.legend()
+            text += "{} = {:.3g}\n".format(labels[i], parms[i])
+        ## According to issue #54, we remove fitting errors from plots
+        #if errparms is not None:
+        #    keys = errparms.keys()
+        #    keys.sort()
+        #    for key in keys:
+        #        text += "Err "+key+" = " + str(errparms[key]) +"\n"
+
     logmax = np.log10(xmax)
     logmin = np.log10(xmin)
     logtext = 0.6*(logmax-logmin)+logmin
@@ -307,6 +315,21 @@ def savePlotCorrelation(parent, dirname, Page, uselatex=False,
     fig.canvas.HACK_fig = fig
     fig.canvas.HACK_Page = Page
     fig.canvas.HACK_append = ""
+    
+
+    # Legend outside of plot
+    # Decrease size of plot to fit legend
+    box = ax.get_position()
+    box2 = ax2.get_position()
+    ax.set_position([box.x0, box.y0 + box.height * 0.2,
+                     box.width, box.height * 0.9])
+    ax2.set_position([box2.x0, box2.y0 + box.height * 0.2,
+                     box2.width, box2.height])
+    
+    ax.legend(loc='upper center', bbox_to_anchor=(0.5, -0.55),
+              prop={'size':9})
+    
+    
     if verbose == True:
         plt.show()
     else:
@@ -348,11 +371,14 @@ def savePlotTrace(parent, dirname, Page, uselatex=False, verbose=False):
     if Page.trace is not None:
         # Set trace
         traces = [Page.trace]
-        labels = [tabtitle]
+        averages = [np.average(Page.trace[:, 1])]
+        labels = ["{} ({:.2f} kHz)".format(tabtitle, np.average(traces[0][:,1]))]
     elif Page.tracecc is not None:
         # We have some cross-correlation here. Two traces.
         traces = Page.tracecc
-        labels = [tabtitle+" A", tabtitle+" B"]
+        averages = [np.average(traces[0][:, 1]), np.average(traces[1][:, 1])]
+        labels = ["{} A ({:.4g} kHz)".format(tabtitle, np.average(traces[0][:,1])),
+                  "{} B ({:.4g} kHz)".format(tabtitle, np.average(traces[1][:,1]))]
     else:
         return
     ## Check if we can use latex for plotting:
@@ -384,16 +410,26 @@ def savePlotTrace(parent, dirname, Page, uselatex=False, verbose=False):
         plt.plot(time, intensity, '-', 
                  label = labels[i],
                  lw=1)
+                 
     plt.ylabel('count rate [kHz]')
     plt.xlabel('time [s]')
-    # Add some more stuff to the text and append data to a .txt file
-    plt.legend()
+    
+    # Legend outside of plot
+    # Decrease size of plot to fit legend
+    box = ax.get_position()
+    ax.set_position([box.x0, box.y0 + box.height * 0.2,
+                     box.width, box.height * 0.9])
+    plt.legend(loc='upper center', 
+               bbox_to_anchor=(0.5, -0.15),
+               prop={'size':9})
+    
     ## Hack
     # We need this for hacking. See edclasses.
     fig.canvas.HACK_parent = parent
     fig.canvas.HACK_fig = fig
     fig.canvas.HACK_Page = Page
     fig.canvas.HACK_append = "_trace"
+
     if verbose == True:
         plt.show()
     else:
diff --git a/src/readfiles/__init__.py b/src/readfiles/__init__.py
index 98517c6..218ee6f 100644
--- a/src/readfiles/__init__.py
+++ b/src/readfiles/__init__.py
@@ -32,7 +32,7 @@ import zipfile
 
 # To add a filetype add it here and in the
 # dictionaries at the end of this file.
-from read_ASC_ALV_6000 import openASC
+from read_ASC_ALV import openASC
 from read_CSV_PyCorrFit import openCSV
 from read_SIN_correlator_com import openSIN
 from read_FCS_Confocor3 import openFCS
@@ -67,7 +67,7 @@ def openAny(dirname, filename):
         if wildcardstring[0] != Allsupfilesstring:
             otherwcs = wildcardstring[1].split(";")
             for string in otherwcs:
-                if string[-3:] == wildcard:
+                if string.strip(" .*") == wildcard:
                     return Filetypes[key](dirname, filename)
     # If we could not find the correct function in Filetypes, try again
     # in BGFiletypes:
@@ -85,7 +85,7 @@ def openAnyBG(dirname, filename):
         if wildcardstring[0] != Allsupfilesstring:
             otherwcs = wildcardstring[1].split(";")
             for string in otherwcs:
-                if string[-3:] == wildcard:
+                if string.strip(" .*") == wildcard:
                     return BGFiletypes[key](dirname, filename)
     # For convenience in openZIP
     return None
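The new extension test uses str.strip with a character set: it removes any of the characters ' ', '.' and '*' from both ends of the wildcard token, so extensions of any length are matched (the old string[-3:] comparison silently failed for the four-letter *.pcfs). A quick check of the idiom in a plain Python shell:

    >>> "*.ASC".strip(" .*")
    'ASC'
    >>> "*.pcfs".strip(" .*")
    'pcfs'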
@@ -107,8 +107,9 @@ def openZIP(dirname, filename):
     Filelist = list()     # List of filenames corresponding to *Curvelist*
     Trace = list()        # Corresponding traces
     ## First test, if we are opening a session file
-    fcsfitwildcard = ".fcsfit-session.zip"
-    if len(filename)>19 and filename[-19:] == fcsfitwildcard:
+    sessionwc = [".fcsfit-session.zip", ".pcfs"]
+    if ( (len(filename)>19 and filename[-19:] == sessionwc[0]) or
+         (len(filename)> 5 and filename[-5:] == sessionwc[1])     ):
         # Get the yaml parms dump:
         yamlfile = Arc.open("Parameters.yaml")
         # Parms: Fitting and drawing parameters of the correlation curve
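For reference, the two length-guarded suffix comparisons above are behaviorally equivalent to str.endswith, which accepts a tuple of suffixes; a sketch of that alternative (not the code used here):

    sessionwc = (".fcsfit-session.zip", ".pcfs")
    if filename.endswith(sessionwc):
        pass  # treat the file as a PyCorrFit session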
@@ -221,16 +222,18 @@ def openZIP(dirname, filename):
 
 
 # The string that is shown when opening all supported files
-Allsupfilesstring = "All supported files"
+# We prepend a space so it is listed first in the dialogs.
+Allsupfilesstring = " All supported files"
 
 # Dictionary with filetypes that we can open
 # The wildcards point to the appropriate functions.
 Filetypes = { "Correlator.com (*.SIN)|*.SIN;*.sin" : openSIN,
-              "Correlator ALV-6000 (*.ASC)|*.ASC" : openASC,
+              "ALV (*.ASC)|*.ASC" : openASC,
               "PyCorrFit (*.csv)|*.csv" : openCSV,
               "Matlab 'Ries (*.mat)|*.mat" : openMAT,
-              "Confocor3 (*.fcs)|*.fcs" : openFCS,
-              "zip files (*.zip)|*.zip" : openZIP
+              "Zeiss ConfoCor3 (*.fcs)|*.fcs" : openFCS,
+              "Zip file (*.zip)|*.zip" : openZIP,
+              "PyCorrFit session (*.pcfs)|*.pcfs" : openZIP
             }
 # For user comfort, add "All supported files" wildcard:
 Filetypes = AddAllWildcard(Filetypes)
@@ -238,9 +241,10 @@ Filetypes = AddAllWildcard(Filetypes)
 
 # Dictionary with filetypes we can open that have intensity traces in them.
 BGFiletypes = { "Correlator.com (*.SIN)|*.SIN;*.sin" : openSIN,
-                "Correlator ALV-6000 (*.ASC)|*.ASC" : openASC,
+                "ALV (*.ASC)|*.ASC" : openASC,
                 "PyCorrFit (*.csv)|*.csv" : openCSV,
-                "Confocor3 (*.fcs)|*.fcs" : openFCS,
-                "zip files (*.zip)|*.zip" : openZIP
+                "Zeiss ConfoCor3 (*.fcs)|*.fcs" : openFCS,
+                "Zip file (*.zip)|*.zip" : openZIP,
+                "PyCorrFit session (*.pcfs)|*.pcfs" : openZIP
               }
 BGFiletypes = AddAllWildcard(BGFiletypes)
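As noted at the top of readfiles/__init__.py, adding a new file format means writing an open function and registering it in both dictionaries. A hypothetical sketch (openXYZ and the *.xyz extension are invented for illustration):

    def openXYZ(dirname, filename):
        # must return a dictionary with the keys
        # "Correlation", "Trace", "Type" and "Filename"
        pass

    Filetypes["Hypothetical XYZ (*.xyz)|*.xyz"] = openXYZ
    BGFiletypes["Hypothetical XYZ (*.xyz)|*.xyz"] = openXYZ

The text before the pipe is the label shown in the file dialog; the text after it is the semicolon-separated wildcard list that openAny and openAnyBG match against.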
diff --git a/src/readfiles/read_ASC_ALV.py b/src/readfiles/read_ASC_ALV.py
new file mode 100755
index 0000000..22dc1ae
--- /dev/null
+++ b/src/readfiles/read_ASC_ALV.py
@@ -0,0 +1,368 @@
+# -*- coding: utf-8 -*-
+""" 
+    PyCorrFit
+    
+    functions in this file: *openASC*
+
+    Copyright (C) 2011-2012  Paul Müller
+
+    This program is free software; you can redistribute it and/or modify
+    it under the terms of the GNU General Public License as published by
+    the Free Software Foundation; either version 2 of the License, or
+    (at your option) any later version.
+
+    This program is distributed in the hope that it will be useful,
+    but WITHOUT ANY WARRANTY; without even the implied warranty of
+    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+    GNU General Public License for more details.
+
+    You should have received a copy of the GNU General Public License 
+    along with this program. If not, see <http://www.gnu.org/licenses/>.
+"""
+from __future__ import division
+
+import os
+import csv
+import numpy as np
+
+
+def openASC(dirname, filename):
+    """ Read data from a .ASC file, created by
+        some ALV-6000 correlator.
+
+            ALV-6000/E-WIN Data
+            Date :	"2/20/2012"
+            ...
+            "Correlation"
+              1.25000E-004	  3.00195E-001
+              2.50000E-004	  1.13065E-001
+              3.75000E-004	  7.60367E-002
+              5.00000E-004	  6.29926E-002
+              6.25000E-004	  5.34678E-002
+              7.50000E-004	  4.11506E-002
+              8.75000E-004	  4.36752E-002
+              1.00000E-003	  4.63146E-002
+              1.12500E-003	  3.78226E-002
+            ...
+              3.35544E+004	 -2.05799E-006
+              3.77487E+004	  4.09032E-006
+              4.19430E+004	  4.26295E-006
+              4.61373E+004	  1.40265E-005
+              5.03316E+004	  1.61766E-005
+              5.45259E+004	  2.19541E-005
+              5.87202E+004	  3.26527E-005
+              6.29145E+004	  2.72920E-005
+
+            "Count Rate"
+               1.17188	      26.77194
+               2.34375	      26.85045
+               3.51563	      27.06382
+               4.68750	      26.97932
+               5.85938	      26.73694
+               7.03125	      27.11332
+               8.20313	      26.81376
+               9.37500	      26.82741
+              10.54688	      26.88801
+              11.71875	      27.09710
+              12.89063	      27.13209
+              14.06250	      27.02200
+              15.23438	      26.95287
+              16.40625	      26.75657
+              17.57813	      26.43056
+            ...
+             294.14063	      27.22597
+             295.31250	      26.40581
+             296.48438	      26.33497
+             297.65625	      25.96457
+             298.82813	      26.71902
+
+        1. We are interested in the "Correlation" section,
+        where the first column denotes tau in ms and the second column the
+        correlation signal. Values are separated by a tab "\t" (some " ").
+
+        2. We are also interested in the "Count Rate" section. Here the times
+        are saved as seconds and not ms like above.
+
+        3. There is a mode where the ALV exports five runs at a time
+        and averages them. The individual correlation curves are stored
+        in the file, but the trace is only stored as an average, so
+        this mode is not recommended. Support for it is included;
+        PyCorrFit then only imports the average data.
+         ~ Paul, 2012-02-20
+        Correlation data starts at "Correlation (Multi, Averaged)".
+
+        Returns:
+        [0]:
+         An array with tuples containing two elements:
+         1st: tau in ms
+         2nd: corresponding correlation signal
+        [1]:
+         Intensity trace:
+         1st: time in ms
+         2nd: Trace in kHz
+        [2]:
+         An array with N elements, indicating how many curves we are
+         opening from the file. Elements can be names and must be
+         convertible to strings.
+    """
+    openfile = open(os.path.join(dirname, filename), 'r')
+    Alldata = openfile.readlines()
+    ## Correlation function
+    # Find out where the correlation function is
+    for i in np.arange(len(Alldata)):
+        if Alldata[i][0:4] == 'Mode':
+            mode = Alldata[i][5:].strip(' ":').strip().strip('"')
+            if mode.lower().count('single'):
+                single = True
+                channel = mode.split(" ")[-1]
+            else:
+                # dual
+                single = False
+            # accc ?
+            if mode.lower().count("cross") == 1:
+                accc = "CC"
+            else:
+                accc = "AC"
+        if Alldata[i][0:12] == '"Correlation':
+            # This tells us if there is only one curve or if there are
+            # multiple curves with an average.
+            if (Alldata[i].strip().lower() == 
+                '"correlation (multi, averaged)"' ):
+                multidata = True
+            else:
+                multidata = False
+        if Alldata[i][0:13] == '"Correlation"':
+            # Start of correlation function
+            StartC = i+1
+        if Alldata[i][0:31] == '"Correlation (Multi, Averaged)"':
+            # Start of AVERAGED correlation function !!!
+            # There are several curves now.
+            StartC = i+2
+        if Alldata[i][0:12] == '"Count Rate"':
+            # End of correlation function
+            EndC = i-2
+            # Start of trace (goes until end of file)
+            StartT = i+1
+    EndT = len(Alldata)
+    # Get the header
+    Namedata = Alldata[StartC-1:StartC]
+    ## Define *curvelist*
+    curvelist = csv.reader(Namedata, delimiter='\t').next()
+    if len(curvelist) <= 2:
+        # Then we have just one single correlation curve
+        curvelist = [""]
+    else:
+        # We have a number of correlation curves. We need to specify
+        # names for them. We take these names from the headings.
+        # Lag times not in the list
+        curvelist.remove(curvelist[0])
+        # Last column is empty
+        curvelist.remove(curvelist[-1])
+    ## Correlation function
+    Truedata = Alldata[StartC:EndC]
+    readdata = csv.reader(Truedata, delimiter='\t')
+    data = list()
+    # Add lists to *data* according to the length of *curvelist*
+    for item in curvelist:
+        data.append(list())
+    # Work through the rows in the read data
+    for row in readdata:
+        for i in np.arange(len(curvelist)):
+            data[i].append( (np.float(row[0]), np.float(row[i+1])) )
+    ## Trace
+    # Trace is stored in two columns
+    # 1st column: time [s]
+    # 2nd column: trace [kHz] 
+    # Get the trace
+    Tracedata = Alldata[StartT:EndT]
+    timefactor = 1000 # because we want ms instead of s
+    readtrace = csv.reader(Tracedata, delimiter='\t')
+    trace = list()
+    trace2 = list()
+    # Work through the rows
+    for row in readtrace:
+        # time in ms, countrate
+        trace.append(list())
+        trace[0].append((np.float(row[0])*timefactor,
+                         np.float(row[1])))
+        # Only trace[0] contains the trace!
+        for i in np.arange(len(curvelist)-1):
+            trace.append(list())
+            trace[i+1].append((np.float(row[0])*timefactor, 0))
+        if not single:
+            k = len(curvelist)/2
+            if int(k) != k:
+                print "Problem with ALV data. Single mode not recognized."
+            # presumably dual mode. There is a second trace
+            # time in ms, countrate
+            trace2.append(list())
+            trace2[0].append((np.float(row[0])*timefactor,
+                              np.float(row[2])))
+            # Only trace2[0] contains the trace!
+            for i in np.arange(len(curvelist)-1):
+                trace2.append(list())
+                trace2[i+1].append((np.float(row[0])*timefactor, 0))
+    # return as an array
+    openfile.close()
+
+    # group the resulting curves
+    corrlist = list()
+    tracelist = list()
+    typelist = list()
+    
+        
+    if single:
+        # There are several runs and possibly one average;
+        # split the trace into len(curvelist)-nav equal parts
+        if multidata:
+            nav = 1
+        else:
+            nav = 0
+        splittrace = mysplit(trace[0], len(curvelist)-nav)
+        i = 0
+        for t in range(len(curvelist)):
+            typ = curvelist[t]
+            if typ.lower()[:7] == "average":
+                typelist.append("{} average".format(channel))
+                corrlist.append(np.array(data[t]))
+                tracelist.append(np.array(trace[0]))
+            else:
+                typelist.append("{} {}".format(accc, channel))
+                corrlist.append(np.array(data[t]))
+                tracelist.append(splittrace[i])
+                i += 1
+    elif accc == "AC":
+        # Dual mode, autocorrelation
+        # We now have two averages and two different channels.
+        # We now have two traces.
+        # The data is assembled in blocks. That means the first block
+        # contains an average and the data of channel 0 and the second
+        # block contains data and average of channel 1. We can thus
+        # handle the data from 0 to len(curvelist)/2 and from
+        # len(curvelist)/2 to len(curvelist) as two separate data sets.
+        # CHANNEL 0
+        if multidata:
+            nav = 1
+        else:
+            nav = 0
+        channel = "CH0"
+        splittrace = mysplit(trace[0], len(curvelist)/2-nav)
+        i = 0
+        for t in range(int(len(curvelist)/2)):
+            typ = curvelist[t]
+            if typ.lower()[:7] == "average":
+                typelist.append("{} average".format(channel))
+                corrlist.append(np.array(data[t]))
+                tracelist.append(np.array(trace[0]))
+            else:
+                typelist.append("{} {}".format(accc, channel))
+                corrlist.append(np.array(data[t]))
+                tracelist.append(splittrace[i])
+                i += 1
+        # CHANNEL 1
+        channel = "CH1"
+        splittrace2 = mysplit(trace2[0], len(curvelist)/2-nav)
+        i = 0
+        for t in range(int(len(curvelist)/2),int(len(curvelist))):
+            typ = curvelist[t]
+            if typ.lower()[:7] == "average":
+                typelist.append("{} average".format(channel))
+                corrlist.append(np.array(data[t]))
+                tracelist.append(np.array(trace2[0]))
+            else:
+                typelist.append("{} {}".format(accc, channel))
+                corrlist.append(np.array(data[t]))
+                tracelist.append(splittrace2[i])
+                i += 1
+    elif accc == "CC":
+        if multidata:
+            nav = 1
+        else:
+            nav = 0
+        # Dual mode, cross-correlation
+        channel = "CC01"
+        splittrace = mysplit(trace[0], len(curvelist)/2-nav)
+        splittrace2 = mysplit(trace2[0], len(curvelist)/2-nav)
+        i = 0
+        for t in range(int(len(curvelist)/2)):
+            typ = curvelist[t]
+            if typ.lower()[:7] == "average":
+                typelist.append("{} average".format(channel))
+                corrlist.append(np.array(data[t]))
+                tracelist.append([np.array(trace[0]),
+                                  np.array(trace2[0])  ])
+            else:
+                typelist.append("{} {}".format(accc, channel))
+                corrlist.append(np.array(data[t]))
+                tracelist.append([splittrace[i], splittrace2[i]])
+                i += 1
+        # CHANNEL 1
+        channel = "CC10"
+        i = 0
+        for t in range(int(len(curvelist)/2),int(len(curvelist))):
+            typ = curvelist[t]
+            if typ.lower()[:7] == "average":
+                typelist.append("{} average".format(channel))
+                corrlist.append(np.array(data[t]))
+                # order must be the same as above
+                tracelist.append([np.array(trace[0]),
+                                  np.array(trace2[0])  ])
+            else:
+                typelist.append("{} {}".format(accc, channel))
+                corrlist.append(np.array(data[t]))
+                # order must be the same as above
+                tracelist.append([splittrace[i], splittrace2[i]])
+                i += 1
+    else:
+        print "Could not detect data file format for: {}".format(filename)
+        corrlist = np.array(data)
+        tracelist = np.array(trace)
+        typelist = curvelist
+
+    dictionary = dict()
+    dictionary["Correlation"] = corrlist
+    dictionary["Trace"] = tracelist
+    dictionary["Type"] = typelist
+    filelist = list()
+    for i in curvelist:
+        filelist.append(filename)
+    dictionary["Filename"] = filelist
+    return dictionary
+
+
+def mysplit(a, n):
+    """
+       Split a trace into n equal parts by interpolation.
+       The signal average is preserved, but the signal variance will
+       decrease.
+    """
+    if n == 1:
+        return [np.array(a)]
+    a = np.array(a)
+    N = len(a)
+    lensplit = np.int(np.ceil(N/n))
+
+    # xp is actually rounded -> recalculate
+    xp, step = np.linspace(a[:,0][0], a[:,0][-1], N,
+                           endpoint=True, retstep=True)
+    
+    # let xp start at zero
+    xp -= a[:,0][0]
+    yp = a[:,1]
+    
+    # time frame for each new curve
+    dx = xp[-1]/n
+
+    # perform interpolation of new trace
+    x, newstep = np.linspace(0, xp[-1], lensplit*n,
+                        endpoint=True, retstep=True)
+    # interpolating reduces the variance and possibly changes the avg
+    y = np.interp(x,xp,yp)
+    
+    data = np.zeros((lensplit*n,2))
+    data[:,0] = x + newstep
+    # make sure that the average stays the same:
+    data[:,1] = y - np.average(y) + np.average(yp)
+    
+    return np.split(data,n)
+    
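A short usage sketch of mysplit with synthetic data (assuming the trace layout used throughout this file: column 0 time in ms, column 1 count rate in kHz):

    import numpy as np
    from read_ASC_ALV import mysplit

    t = np.linspace(0, 300, 900)              # time [ms]
    c = 25 + np.random.normal(0, 1, 900)      # count rate [kHz]
    trace = np.column_stack((t, c))
    parts = mysplit(trace, 4)                 # four equal sub-traces
    # the overall average count rate is preserved by construction
    print(np.allclose(np.concatenate(parts)[:, 1].mean(), c.mean()))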
diff --git a/src/readfiles/read_ASC_ALV_6000.py b/src/readfiles/read_ASC_ALV_6000.py
deleted file mode 100755
index e544dc8..0000000
--- a/src/readfiles/read_ASC_ALV_6000.py
+++ /dev/null
@@ -1,177 +0,0 @@
-# -*- coding: utf-8 -*-
-""" 
-    PyCorrFit
-    
-    functions in this file: *openASC*
-
-    Copyright (C) 2011-2012  Paul Müller
-
-    This program is free software; you can redistribute it and/or modify
-    it under the terms of the GNU General Public License as published by
-    the Free Software Foundation; either version 2 of the License, or
-    (at your option) any later version.
-
-    This program is distributed in the hope that it will be useful,
-    but WITHOUT ANY WARRANTY; without even the implied warranty of
-    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-    GNU General Public License for more details.
-
-    You should have received a copy of the GNU General Public License 
-    along with this program. If not, see <http://www.gnu.org/licenses/>.
-"""
-import os
-import csv
-import numpy as np
-
-
-def openASC(dirname, filename):
-    """ Read data from a .ASC file, created by
-        some ALV-6000 correlator.
-
-            ALV-6000/E-WIN Data
-            Date :	"2/20/2012"
-            ...
-            "Correlation"
-              1.25000E-004	  3.00195E-001
-              2.50000E-004	  1.13065E-001
-              3.75000E-004	  7.60367E-002
-              5.00000E-004	  6.29926E-002
-              6.25000E-004	  5.34678E-002
-              7.50000E-004	  4.11506E-002
-              8.75000E-004	  4.36752E-002
-              1.00000E-003	  4.63146E-002
-              1.12500E-003	  3.78226E-002
-            ...
-              3.35544E+004	 -2.05799E-006
-              3.77487E+004	  4.09032E-006
-              4.19430E+004	  4.26295E-006
-              4.61373E+004	  1.40265E-005
-              5.03316E+004	  1.61766E-005
-              5.45259E+004	  2.19541E-005
-              5.87202E+004	  3.26527E-005
-              6.29145E+004	  2.72920E-005
-
-            "Count Rate"
-               1.17188	      26.77194
-               2.34375	      26.85045
-               3.51563	      27.06382
-               4.68750	      26.97932
-               5.85938	      26.73694
-               7.03125	      27.11332
-               8.20313	      26.81376
-               9.37500	      26.82741
-              10.54688	      26.88801
-              11.71875	      27.09710
-              12.89063	      27.13209
-              14.06250	      27.02200
-              15.23438	      26.95287
-              16.40625	      26.75657
-              17.57813	      26.43056
-            ...
-             294.14063	      27.22597
-             295.31250	      26.40581
-             296.48438	      26.33497
-             297.65625	      25.96457
-             298.82813	      26.71902
-
-        1. We are interested in the "Correlation" section,
-        where the first column denotes tau in ms and the second row the
-        correlation signal. Values are separated by a tabulator "\t" (some " ").
-
-        2. We are also interested in the "Count Rate" section. Here the times
-        are saved as seconds and not ms like above.
-
-        3. There is some kind of mode where the ALV exports five runs at a
-        time and averages them. The sole correlation data is stored in the
-        file, but the trace is only stored as average or something.
-        So I would not recommend this. However, I added support for this.
-        PyCorrFit then only imports the average data.
-         ~ Paul, 2012-02-20
-        Correlation data starts at "Correlation (Multi, Averaged)".
-
-        Returns:
-        [0]:
-         An array with tuples containing two elements:
-         1st: tau in ms
-         2nd: corresponding correlation signal
-        [1]:
-         Intensity trace:
-         1st: time in ms
-         2nd: Trace in kHz
-        [2]:
-         An array with N elements, indicating, how many curves we are opening
-         from the file. Elements can be names and must be convertible to
-         strings.
-    """
-    openfile = open(os.path.join(dirname, filename), 'r')
-    Alldata = openfile.readlines()
-    ## Correlation function
-    # Find out where the correlation function is
-    for i in np.arange(len(Alldata)):
-        if Alldata[i][0:13] == '"Correlation"':
-            # Start of correlation function
-            StartC = i+1
-        if Alldata[i][0:31] == '"Correlation (Multi, Averaged)"':
-            # Start of AVERAGED correlation function !!!
-            # There are several curves now.
-            StartC = i+2
-        if Alldata[i][0:12] == '"Count Rate"':
-            # End of correlation function
-            EndC = i-2
-            # Start of trace (goes until end of file)
-            StartT = i+1
-    EndT = Alldata.__len__()
-    # Get the header
-    Namedata = Alldata.__getslice__(StartC-1, StartC)
-    ## Define *curvelist*
-    curvelist = csv.reader(Namedata, delimiter='\t').next()
-    if len(curvelist) <= 2:
-        # Then we have just one single correlation curve
-        curvelist = [""]
-    else:
-        # We have a number of correlation curves. We need to specify
-        # names for them. We take these names from the headings.
-        # Lag times not in the list
-        curvelist.remove(curvelist[0])
-        # Last column is empty
-        curvelist.remove(curvelist[-1])
-    ## Correlation function
-    Truedata = Alldata.__getslice__(StartC, EndC)
-    readdata = csv.reader(Truedata, delimiter='\t')
-    data = list()
-    # Add lists to *data* according to the length of *curvelist*
-    for item in curvelist:
-        data.append(list())
-    # Work through the rows in the read data
-    for row in readdata:
-        for i in np.arange(len(curvelist)):
-            data[i].append( (np.float(row[0]), np.float(row[i+1])) )
-    ## Trace
-    # Trace is stored in two columns
-    # 1st column: time [s]
-    # 2nd column: trace [kHz] 
-    # Get the trace
-    Tracedata = Alldata.__getslice__(StartT, EndT)
-    timefactor = 1000 # because we want ms instead of s
-    readtrace = csv.reader(Tracedata, delimiter='\t')
-    trace = list()
-    # Add lists to *trace* according to the length of *curvelist*
-    for item in curvelist:
-        trace.append(list())
-    # Work through the rows
-    for row in readtrace:
-        # tau in ms, corr-function
-        trace[0].append((np.float(row[0])*timefactor, np.float(row[1])))
-        for i in np.arange(len(curvelist)-1):
-            trace[i+1].append((np.float(row[0])*timefactor, 0))
-    # return as an array
-    openfile.close()
-    dictionary = dict()
-    dictionary["Correlation"] = np.array(data)
-    dictionary["Trace"] = np.array(trace)
-    dictionary["Type"] = curvelist
-    filelist = list()
-    for i in curvelist:
-        filelist.append(filename)
-    dictionary["Filename"] = filelist
-    return dictionary
diff --git a/src/readfiles/read_FCS_Confocor3.py b/src/readfiles/read_FCS_Confocor3.py
index 1f09319..e4fc8b9 100644
--- a/src/readfiles/read_FCS_Confocor3.py
+++ b/src/readfiles/read_FCS_Confocor3.py
@@ -60,6 +60,10 @@ def openFCS_Multiple(dirname, filename):
         This works with files from the Confocor2, Confocor3 (AIM) and 
         files created from the newer ZEN Software.
     """
+    ### TODO:
+    # match curves with their timestamp
+    # (actimelist and cctimelist)
+    #
     openfile = open(os.path.join(dirname, filename), 'r')
     Alldata = openfile.readlines()
     # Start progressing through the file. i is the line index.
@@ -75,6 +79,10 @@ def openFCS_Multiple(dirname, filename):
     cclist = list()     # All cross-correlation functions
     # The intensity traces
     traces = list()
+    # we use "AcquisitionTime" to match up curves
+    thistime = None
+    actimelist = list()
+    cctimelist = list()
     # The correlation curves
     ac_correlations = list()
     cc_correlations = list()
@@ -84,6 +92,8 @@ def openFCS_Multiple(dirname, filename):
             fcsset = True
             gottrace = False
         if fcsset == True:
+            if Alldata[i].partition("=")[0].strip() == "AcquisitionTime":
+                thistime = Alldata[i].partition("=")[2].strip()
             if Alldata[i].partition("=")[0].strip() == "Channel":
                 # Find out what type of correlation curve we have.
                 # Might be interesting to the user.
@@ -113,6 +123,11 @@ def openFCS_Multiple(dirname, filename):
                     # the next "FcsDataSet"-section.
                     print "Unknown channel configuration in .fcs file: "+FCStype
                     fcsset = False
+                elif FoundType[:2] == "CC":
+                    cctimelist.append(thistime)
+                elif FoundType[:2] == "AC":
+                    actimelist.append(thistime)
+
             if Alldata[i].partition("=")[0].strip() == "CountRateArray":
                 # Start importing the trace. This is a little difficult, since
                 # traces in those files are usually very large. We will bin
@@ -216,6 +231,11 @@ def openFCS_Multiple(dirname, filename):
     #          ac_correlations. Not in cc_correlations,
     #          because cross-correlations are not saved with traces.?
     #
+    # Identifiers:
+    #  actimelist: acquisition times corresponding to aclist
+    #  cctimelist: acquisition times corresponding to cclist
+    #
+    # Correlations:
     #  ac_correlations: AC-correlation data in list.
     #  cc_correlations: CC-correlation data in list.
     # 
@@ -233,7 +253,13 @@ def openFCS_Multiple(dirname, filename):
     curvelist = list()
     tracelist = list()
     corrlist = list()
+    
+    ### TODO:
+    # match curves with their timestamp
+    # (actimelist and cctimelist)
+    
     for i in np.arange(len(ac_correlations)):
+        # Skip (ignore) curves that contain no correlation data
         if ac_correlations[i] is not None:
             curvelist.append(aclist[i])
             tracelist.append(1*traces[i])
@@ -241,12 +267,18 @@ def openFCS_Multiple(dirname, filename):
         else:
             if traces[i] is not None:
                 warnings.warn("File {} curve {} does not contain AC data.".format(filename, i))
+    # Overwrite traces. This way we have equal number of ac correlations
+    # and traces.
+    traces = tracelist
     ## The CC traces are more tricky:
     # Add traces to CC-correlation functions.
     # It seems reasonable, that if number of AC1,AC2 and CC are equal,
     # CC gets the traces accordingly.
-    n_ac1 = aclist.count("AC1")
-    n_ac2 = aclist.count("AC2")
+    # We take the number of ac curves from curvelist instead of aclist,
+    # because aclist may contain curves without ac data (see above).
+    # In that case, the CC traces most likely belong to the ACs.
+    n_ac1 = curvelist.count("AC1")
+    n_ac2 = curvelist.count("AC2")
     n_cc12 = cclist.count("CC12")
     n_cc21 = cclist.count("CC21")
     if n_ac1==n_ac2==n_cc12==n_cc21>0:
@@ -390,6 +422,11 @@ def openFCS_Single(dirname, filename):
                         corr.append( (np.float(row[0]), np.float(row[1])-1) )
                     corr = np.array(corr)
                 fcscurve = False
+                
+    # Check for correlation at lag-time zero, which led to a bug (#64)
+    # on Mac OS X and potentially affects fitting.
+    if corr[0][0] == 0:
+        corr = corr[1:]
     openfile.close()
     dictionary = dict()
     dictionary["Correlation"] = [corr]
diff --git a/src/tools/__init__.py b/src/tools/__init__.py
index 049a2a2..4b73de7 100644
--- a/src/tools/__init__.py
+++ b/src/tools/__init__.py
@@ -5,6 +5,23 @@
     This file contains useful tools, such as dialog boxes and other stuff,
     that we need in PyCorrFit.
 
+    The tools work with triggers on page updates. Every tool has a 
+    function `OnPageChanged(self, page, trigger=None)` which is called
+    when something in the frontend changes. In order to minimize user
+    stall time, these functions are not executed for a certain list
+    of triggers that is defined in each function. This e.g. dramatically
+    speeds up tools like "Statistics view" when batch fitting (a minimal
+    sketch of the pattern follows after this hunk).
+    
+    Recognized triggers:
+     tab_init           : initial stuff that is done for a new page
+     tab_browse         : the tab has changed and a new page is visible
+     fit_batch          : the page is batch-fitted right now
+     fit_finalize       : a (batch) fitting process is finished
+     parm_batch         : parameters are changed in a batch process
+     parm_finalize      : finished (batch) changing of page parameters
+     page_add_batch     : when many pages are added at the same time
+     page_add_finalize  : finished (batch) adding of pages
+
     Dimensionless representation:
     unit of time        : 1 ms
     unit of inverse time: 10³ /s
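A minimal sketch of the trigger-filtering pattern described above (the Tool class is hypothetical; the trigger names are the ones listed in the doc string):

    class Tool(object):
        def OnPageChanged(self, page, trigger=None):
            # skip expensive updates while a batch operation runs;
            # the matching *_finalize trigger redraws once at the end
            if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
                return
            self.Page = page
            # ... rebuild the tool's display here ...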
diff --git a/src/tools/average.py b/src/tools/average.py
index 8a29494..a87e859 100644
--- a/src/tools/average.py
+++ b/src/tools/average.py
@@ -108,18 +108,30 @@ class Average(wx.Frame):
         self.Destroy()
 
 
-    def OnPageChanged(self, page):
+    def OnPageChanged(self, page, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
         # When parent changes
         # This is a necessary function for PyCorrFit.
         # This is stuff that should be done when the active page
         # of the notebook changes.
-        idsel = self.WXDropSelMod.GetSelection()
-        self.SetValues()
-        # Set back user selection:
-        self.WXDropSelMod.SetSelection(idsel)
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch",
+                       "parm_finalize", "fit_finalize"]:
+            return
         if self.parent.notebook.GetPageCount() == 0:
             self.panel.Disable()
             return
+        
+        #idsel = self.WXDropSelMod.GetSelection()
+        self.SetValues()
+        # Set back user selection:
+        #self.WXDropSelMod.SetSelection(idsel)
+
         self.panel.Enable()
         self.Page = page
 
@@ -206,7 +218,11 @@ class Average(wx.Frame):
                     if len(tracetime[j]) != 0:
                         # append to the trace
                         oldend = tracetime[j][-1]
-                        newtracetime = 1.*trace[j][:,0]
+                        # assume equidistant sampling; shift the new
+                        # trace so it continues one sample spacing
+                        # after the end of the previous trace
+                        offset = trace[j][:,0][1] - 2*trace[j][:,0][0]
+                        newtracetime = 1.*trace[j][:,0] + offset
                         newtracetime = newtracetime + oldend
                         tracetime[j] = np.append(tracetime[j], newtracetime)
                         del newtracetime
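A toy calculation showing what the offset achieves (numbers invented): if the previous trace ends at oldend = 100 ms and the new trace has times [2, 4, 6] ms, then offset = 4 - 2*2 = 0 and the appended times become [102, 104, 106] ms, i.e. the concatenated trace continues exactly one sample spacing (here 2 ms) past the old end:

    import numpy as np

    oldend = 100.0
    t = np.array([2.0, 4.0, 6.0])    # new trace times [ms]
    offset = t[1] - 2*t[0]           # = (t[1] - t[0]) - t[0]
    print(t + offset + oldend)       # [ 102.  104.  106.]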
@@ -278,8 +294,8 @@ class Average(wx.Frame):
         # Obtain the model ID from the dropdown selection.
         idsel = self.WXDropSelMod.GetSelection()
         modelid = self.DropdownIndex[idsel]
-        self.parent.add_fitting_tab(modelid = modelid)
-        self.AvgPage = self.parent.notebook.GetCurrentPage()
+        self.AvgPage = self.parent.add_fitting_tab(modelid = modelid,
+                                                   select = True)
         (self.AvgPage.startcrop, self.AvgPage.endcrop) = interval
         self.AvgPage.dataexpfull = average
         self.AvgPage.IsCrossCorrelation = self.IsCrossCorrelation
@@ -287,7 +303,7 @@ class Average(wx.Frame):
             newtrace = newtraces[0]
             if newtrace is not None and len(newtrace) != 0:
                 self.AvgPage.trace = newtrace
-                self.AvgPage.traceavg = newtrace.mean()
+                self.AvgPage.traceavg = newtrace[:,1].mean()
             else:
                 self.AvgPage.trace = None
                 self.AvgPage.traceavg = None
@@ -296,7 +312,6 @@ class Average(wx.Frame):
                 self.AvgPage.tracecc = newtraces
             else:
                 self.AvgPage.tracecc = None
-        self.AvgPage.PlotAll()
         self.AvgPage.Fit_enable_fitting()
         if len(pages) == 1:
             # Use the same title as the first page
@@ -332,6 +347,7 @@ class Average(wx.Frame):
                 WeightKinds += [key]
             self.AvgPage.Fitbox[1].SetItems(WeightKinds)
             self.AvgPage.Fitbox[1].SetSelection(IndexInList)
+        self.AvgPage.PlotAll()
         # Keep the average tool open.
         # self.OnClose()
 
diff --git a/src/tools/background.py b/src/tools/background.py
index 53bd265..907af52 100644
--- a/src/tools/background.py
+++ b/src/tools/background.py
@@ -417,10 +417,19 @@ class BackgroundCorrection(wx.Frame):
             self.parent.notebook.GetPage(i).OnAmplitudeCheck()
 
 
-    def OnPageChanged(self, page=None):
+    def OnPageChanged(self, page=None, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
         # We do not need the *Range* Commands here yet.
         # We open and close the SelectChannelsFrame every time we
         # import some data.
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
+            return
         if len(self.parent.Background) == 0:
             self.BGlist = list()
             self.UpdateDropdown()
diff --git a/src/tools/batchcontrol.py b/src/tools/batchcontrol.py
index f97aef9..de940a7 100644
--- a/src/tools/batchcontrol.py
+++ b/src/tools/batchcontrol.py
@@ -117,7 +117,9 @@ class BatchCtrl(wx.Frame):
             OtherPage = self.parent.notebook.GetPage(i)
             if OtherPage.modelid == modelid and OtherPage.dataexp is not None:
                 self.parent.UnpackParameters(Parms, OtherPage)
-                OtherPage.PlotAll()
+                OtherPage.PlotAll(trigger="parm_batch")
+        # Update all other tools with the finalize trigger.
+        self.parent.OnFNBPageChanged(trigger="parm_finalize")
 
 
     def OnClose(self, event=None):
@@ -144,15 +146,28 @@ class BatchCtrl(wx.Frame):
             if (OtherPage.modelid == modelid and
                 OtherPage.dataexpfull is not None):
                 #Fit
-                OtherPage.Fit_function(noplots=True)
-
-
-    def OnPageChanged(self, Page=None):
+                OtherPage.Fit_function(noplots=True,trigger="fit_batch")
+        # Update all other tools with the finalize trigger.
+        self.parent.OnFNBPageChanged(trigger="fit_finalize")
+
+
+    def OnPageChanged(self, Page=None, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
         if self.parent.notebook.GetPageCount() == 0:
             self.panel.Disable()
             return
         else:
             self.panel.Enable()
+        # Filter triggers
+        if trigger in ["fit_batch", "fit_finalize",
+                       "parm_batch", "parm_finalize"]:
+            return
         # We need to update the list of Pages in self.dropdown
         if self.rbtnhere.Value == True:
             DDlist = list()
@@ -165,7 +180,7 @@ class BatchCtrl(wx.Frame):
 
 
     def OnRadioHere(self, event=None):
-        self.OnPageChanged()
+        self.OnPageChanged(trigger="view")
 
 
     def OnRadioThere(self, event=None):
diff --git a/src/tools/datarange.py b/src/tools/datarange.py
index 57f02d0..1732b2d 100644
--- a/src/tools/datarange.py
+++ b/src/tools/datarange.py
@@ -236,11 +236,13 @@ class SelectChannels(wx.Frame):
         self.Destroy()
 
 
-    def OnPageChanged(self, page):
+    def OnPageChanged(self, page, trigger=None):
         # We do not need the *Range* Commands here yet.
         # We open and close the SelectChannelsFrame every time we
         # import some data.
         #
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
+            return
         # Check if we have a fixed channel selection
         if self.parent.notebook.GetPageCount() == 0:
             self.panel.Disable()
diff --git a/src/tools/example.py b/src/tools/example.py
index d2ed007..46f5904 100644
--- a/src/tools/example.py
+++ b/src/tools/example.py
@@ -75,7 +75,14 @@ class Tool(wx.Frame):
         self.Destroy()
 
 
-    def OnPageChanged(self, page):
+    def OnPageChanged(self, page, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
         # When parent changes
         # This is a necessary function for PyCorrFit.
         # This is stuff that should be done when the active page
diff --git a/src/tools/globalfit.py b/src/tools/globalfit.py
index e40e2ae..3cd5840 100644
--- a/src/tools/globalfit.py
+++ b/src/tools/globalfit.py
@@ -277,11 +277,20 @@ check parameters on each page and start 'Global fit'.
             Page.PlotAll()
 
 
-    def OnPageChanged(self, page):
+    def OnPageChanged(self, page, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
         # When parent changes
         # This is a necessary function for PyCorrFit.
         # This is stuff that should be done when the active page
         # of the notebook changes.
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
+            return
         if self.parent.notebook.GetPageCount() == 0:
             self.panel.Disable()
             return
diff --git a/src/tools/info.py b/src/tools/info.py
index 0adbb89..55983bf 100644
--- a/src/tools/info.py
+++ b/src/tools/info.py
@@ -32,6 +32,7 @@
 import wx
 import numpy as np
 
+import fitting
 import models as mdls
 
 # Menu entry name
@@ -164,6 +165,7 @@ class InfoClass(object):
             else:
                 InfoDict["modelsupdoc"] = [func_info.func_doc]
         ## Fitting
+        alg = fitting.Algorithms[Page.fit_algorithm][1]
         weightedfit = Page.weighted_fit_was_performed
         weightedfit_type = Page.weighted_fittype
         fittingbins = Page.weighted_nuvar  # from left and right
@@ -174,13 +176,14 @@ class InfoClass(object):
                 Title.append(["Type AC/CC", "Cross-correlation" ]) 
             else:
                 Title.append(["Type AC/CC", "Autocorrelation" ]) 
-            Fitting.append([ u"\u03c7"+"²", Page.chi2 ])
+            Fitting.append([ u"χ²", Page.chi2 ])
             if Page.weighted_fit_was_performed:
-                Chi2type = "reduced "+u"\u03c7"+"²"
+                Chi2type = u"Weighted sum of squares"
             else:
-                Chi2type = "reduced sum of squares"
-            Fitting.append([ u"\u03c7"+"²-type", Chi2type ])
+                Chi2type = u"Sum of squares"
+            Fitting.append([ u"χ²-type", Chi2type ])
             Fitting.append([ "Weighted fit", weightedfit_type ])
+            Fitting.append([ "Algorithm", alg ])
             if len(Page.GlobalParameterShare) != 0:
                 shared = str(Page.GlobalParameterShare[0])
                 for item in Page.GlobalParameterShare[1:]:
@@ -337,7 +340,16 @@ class ShowInfo(wx.Frame):
             print "Other application has lock on clipboard."
 
 
-    def OnPageChanged(self, page=None):
+    def OnPageChanged(self, page=None, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
+            return
         # When parent changes
         self.Page = page
         self.Content()
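The new "Algorithm" row reads a display name via fitting.Algorithms[Page.fit_algorithm][1]. Judging from that lookup alone, each entry is presumably a (function, display name) pair keyed by an identifier; a guessed sketch of the structure, not taken from fitting.py:

    import scipy.optimize as spopt

    Algorithms = {"Lev-Mar": (spopt.leastsq, "Levenberg-Marquardt")}
    alg = Algorithms["Lev-Mar"][1]    # -> "Levenberg-Marquardt"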
diff --git a/src/tools/overlaycurves.py b/src/tools/overlaycurves.py
index fdf0bdb..999bf5e 100644
--- a/src/tools/overlaycurves.py
+++ b/src/tools/overlaycurves.py
@@ -130,7 +130,17 @@ class Wrapper_Tools(object):
         self.Selector.Destroy()
 
 
-    def OnPageChanged(self, page=None):
+    def OnPageChanged(self, page=None, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch",
+                       "tab_init"]:
+            return
         # When parent changes
         # This is a necessary function for PyCorrFit.
         # This is stuff that should be done when the active page
@@ -139,16 +149,20 @@ class Wrapper_Tools(object):
             self.Selector.SelectBox.SetItems([])
             self.Selector.sp.Disable()
         else:
+            oldlabels = self.Selector.curvelabels
+            self.Selector.sp.Enable()
             # Sticky behavior cleaned up in 0.7.8
             curvedict, labels = self.GetCurvedict()
             self.Selector.curvedict = curvedict
             self.Selector.labels = labels
             self.Selector.ProcessDict()
             self.labels = labels
-            self.Selector.SelectBox.SetItems(self.Selector.curvelabels)
+            if oldlabels != self.Selector.curvelabels:
+                self.Selector.SelectBox.SetItems(
+                                              self.Selector.curvelabels)
             for i in np.arange(len(self.Selector.curvekeys)):
                 self.Selector.SelectBox.SetSelection(i)
-            self.Selector.OnUpdatePlot()
+            self.Selector.OnUpdatePlot(trigger=trigger)
 
 
     def OnResults(self, keyskeep, keysrem):
@@ -184,7 +198,7 @@ class Wrapper_Tools(object):
         self.OnPageChanged()
 
 
-    def OnSelectionChanged(self, keylist):
+    def OnSelectionChanged(self, keylist, trigger=None):
         if len(keylist) == 0:
             return
         # integer type list with page number
@@ -275,6 +289,7 @@ class UserSelectCurves(wx.Frame):
                                     style=style, choices=self.curvelabels)
         for i in np.arange(len(self.curvekeys)):
             self.SelectBox.SetSelection(i)
+
         # Deselect keys that are not in self.selkeys
         if self.selkeys is not None:
             for i in np.arange(len(self.curvekeys)):
@@ -356,15 +371,21 @@ class UserSelectCurves(wx.Frame):
         self.wrapper.OnResults(keyskeep, keysrem)
 
 
-    def OnUpdatePlot(self, e=None):
+    def OnUpdatePlot(self, e=None, trigger=None):
         """ What should happen when the selection in *self.SelectBox*
             is changed?
+            
         This function will also try to call the function
             *self.parent.OnSelectionChanged* and hand over the list of
             currently selected curves. This is an addon for 0.7.8
             where we will control the page selection in the average
             tool.
+            If `trigger` is something that occurs during loading of
+            data, then we will not replot everything.
         """
+        #if e is not None and e.GetEventType() == 10007:
+        #    return
+
         # Get selected curves
         curves = list()
         legends = list()
@@ -387,13 +408,13 @@ class UserSelectCurves(wx.Frame):
         self.canvas.SetEnableLegend(True)
         if len(curves) != 0:
             self.canvas.Draw(plot.PlotGraphics(lines, 
-                         xLabel=u'lag time τ [s]', 
+                         xLabel=u'lag time τ [ms]', 
                          yLabel=u'G(τ)'))
         ## This is an addon for 0.7.8
         keyskeep = list()
         for i in self.SelectBox.GetSelections():
             keyskeep.append(self.curvekeys[i])
         try:
-            self.wrapper.OnSelectionChanged(keyskeep)
+            self.wrapper.OnSelectionChanged(keyskeep, trigger=trigger)
         except:
             pass
diff --git a/src/tools/parmrange.py b/src/tools/parmrange.py
index 122aace..153e3f1 100644
--- a/src/tools/parmrange.py
+++ b/src/tools/parmrange.py
@@ -106,11 +106,20 @@ class RangeSelector(wx.Frame):
         self.Destroy()
 
 
-    def OnPageChanged(self, page=None):
+    def OnPageChanged(self, page=None, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
         # When parent changes
         # This is a necessary function for PyCorrFit.
         # This is stuff that should be done when the active page
         # of the notebook changes.
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
+            return
         self.Page = page
         if self.parent.notebook.GetPageCount() == 0:
             self.panel.Disable()
diff --git a/src/tools/plotexport.py b/src/tools/plotexport.py
index c92fcbd..f7ecd97 100644
--- a/src/tools/plotexport.py
+++ b/src/tools/plotexport.py
@@ -72,7 +72,14 @@ class Tool(wx.Frame):
         self.Destroy()
 
 
-    def OnPageChanged(self, page):
+    def OnPageChanged(self, page, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+        """
         # When parent changes
         # This is a necessary function for PyCorrFit.
         # This is stuff that should be done when the active page
diff --git a/src/tools/simulation.py b/src/tools/simulation.py
index ab2cb0a..776e427 100644
--- a/src/tools/simulation.py
+++ b/src/tools/simulation.py
@@ -251,7 +251,15 @@ class Slide(wx.Frame):
         self.OnSize()
 
 
-    def OnPageChanged(self, page=None, init=False):
+    def OnPageChanged(self, page=None, trigger=None, init=False):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip
+            needless executions that would otherwise stall the user.
+            For a list of possible triggers, see the doc string
+            of `tools`.
+            'init' is used by this tool only.
+        """
         #if init:
         #    # Get the parameters of the current page.
         #    self.SavedParms = self.parent.PackParameters(self.Page)
@@ -259,6 +267,8 @@ class Slide(wx.Frame):
         # This is a necessary function for PyCorrFit.
         # This is stuff that should be done when the active page
         # of the notebook changes.
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
+            return
         if self.parent.notebook.GetPageCount() == 0:
             self.panel.Disable()
             return
diff --git a/src/tools/statistics.py b/src/tools/statistics.py
index 1f7e937..21157d4 100644
--- a/src/tools/statistics.py
+++ b/src/tools/statistics.py
@@ -28,6 +28,7 @@
     You should have received a copy of the GNU General Public License 
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
+from __future__ import division
 
 import datetime 
 import wx
@@ -71,7 +72,7 @@ class Stat(wx.Frame):
         # Page - the currently active page of the notebook.
         self.Page = self.parent.notebook.GetCurrentPage()
         # Pagenumbers
-        self.PageNumbers = np.arange(self.parent.notebook.GetPageCount())
+        self.PageNumbers = range(1, 1+self.parent.notebook.GetPageCount())
         ## Splitter window. left side: checkboxes
         ##                  right side: plot with parameters
         self.sp = wx.SplitterWindow(self, style=wx.SP_3DSASH)
@@ -176,14 +177,17 @@ class Stat(wx.Frame):
             wx.Frame.SetIcon(self, parent.MainIcon)
         self.Show(True)
         self.OnDropDown()
-        self.OnDropDown()
 
-    def GetListOfAllParameters(self, e=None, return_std_checked=False):
+
+    def GetListOfAllParameters(self, e=None, return_std_checked=False,
+                                page=None):
         """ Returns sorted list of parameters.
             If return_std_checked is True, then a second list with
             standard checked parameters is returned.
         """
-        self.InfoClass.CurPage = self.Page
+        if page is None:
+            page = self.Page
+        self.InfoClass.CurPage = page
         # Now that we know our Page, we may change the available
         # parameter options.
         Infodict = self.InfoClass.GetCurInfo()
@@ -208,20 +212,18 @@ class Stat(wx.Frame):
                 headparm = list()
                 bodyparm = list()
                 for parm in Infodict[key]:
-                    #parminlist = False
+                    headparm.append(parm)
                     try:
                         for fitp in Infodict["fitting"]:
                             parmname = parm[0]
                             errname = "Err "+parmname
                             if fitp[0] == errname:
-                                headparm.append(parm)
-                                #parminlist = True
                                 headparm.append(fitp)
                     except:
-                        # Maybe there was not fit...
+                        # There was no fit, the "Lev-Mar" fit
+                        # failed, or another fit algorithm was
+                        # used.
                         pass
-                    #if parminlist == False:
-                    headparm.append(parm)
             elif key == "fitting":
                 for fitp in Infodict[key]:
                     # We added the error data before in the parm section
@@ -260,22 +262,24 @@ class Stat(wx.Frame):
             for checkitem in checklist:
                 if item[0].count(checkitem):
                     checked[i] = False
-
         if return_std_checked:
             return Info, checked
         else:
             return Info
 
         
-    def GetListOfPlottableParms(self, e=None, return_values=False):
+    def GetListOfPlottableParms(self, e=None, return_values=False,
+                                page=None):
         """ Returns sorted list of parameters that can be plotted.
             (This means that the values are convertable to floats)
             If return_values is True, then a second list with
             the corresponding values is returned.
         """
+        if page is None:
+            page = self.Page
         if self.parent.notebook.GetPageCount() != 0:
             #Info = self.InfoClass.GetPageInfo(self.Page)
-            Info = self.GetListOfAllParameters()
+            Info = self.GetListOfAllParameters(page=page)
             #keys = Info.keys()
             #keys.sort()
             parmlist = list()
@@ -380,13 +384,16 @@ class Stat(wx.Frame):
 
 
     def OnChooseValues(self, event=None):
+        """
+            Plot the values for the parameter selected by the user.
+        """
         Info, checked = self.GetListOfAllParameters(
                                                 return_std_checked=True)
         #headcounter = 0
         #headlen = len(head)
         # We will sort the checkboxes in more than one column if there
         # are more than *maxitemsincolumn*
-        maxitemsincolumn = np.float(25)
+        maxitemsincolumn = np.float(19)
         Sizernumber = int(np.ceil(len(Info)/maxitemsincolumn))
         self.boxsizerlist = list()
         for i in np.arange(Sizernumber):
@@ -432,6 +439,7 @@ class Stat(wx.Frame):
         if self.parent.notebook.GetPageCount() == 0 or self.Page is None:
             self.canvas.Clear()
             return
+        
         # Get valid pages
         strFull = self.WXTextPages.GetValue()
         try:
@@ -457,8 +465,7 @@ class Stat(wx.Frame):
                     pages.append(Page)
         plotcurve = list()
         for page in pages:
-            self.Page = page
-            pllabel, pldata = self.GetListOfPlottableParms(
+            pllabel, pldata = self.GetListOfPlottableParms(page=page,
                                                      return_values=True)
             # Get the labels and make a plot of the parameters
             if len(pllabel)-1 >= DDselid and pllabel[DDselid] == label:
@@ -475,54 +482,72 @@ class Stat(wx.Frame):
                         plotcurve.append([x,y])
         # Prepare plotting
         self.canvas.Clear()
-        linesig = plot.PolyMarker(plotcurve, size=1.5,
-                              fillstyle=wx.TRANSPARENT, marker='circle')
+        linesig = plot.PolyMarker(plotcurve, size=1.5, marker='circle')
         plotlist = [linesig]
         # average line
-
         try:
             avg = np.average(np.array(plotcurve)[:,1])
-            maxpage =  np.max(np.array(plotcurve)[:,0])
+            maxpage =  int(np.max(np.array(plotcurve)[:,0]) +1)
+            minpage =  int(np.min(np.array(plotcurve)[:,0]) -1)
         except:
+            minpage = 0
             maxpage = 0
             self.WXavg.SetValue("-")
             self.WXsd.SetValue("-")
         else:
             # Plot data
             plotavg = [[0.5, avg], [maxpage+.5, avg]]
-            lineclear = plot.PolyLine(plotavg, colour="black",
-                                      style= wx.SHORT_DASH)
+            #lineclear = plot.PolyLine(plotavg, colour="black")
+            lineclear = plot.PolyMarker(plotavg, colour="black")
             plotlist.append(lineclear)
             # Update Text control
             self.WXavg.SetValue(str(avg))
             self.WXsd.SetValue(str(np.std(np.array(plotcurve)[:,1])))
-            
         # Draw
-        self.canvas.Draw(plot.PlotGraphics(plotlist, 
-                             xLabel='page number', 
-                             yLabel=label))
-
+        # Plotting both items causes a memory leak after this function
+        # has been called several times with the same large data set.
+        # The leak goes away if only linesig OR lineclear is plotted.
+        #
+        #graphics = plot.PlotGraphics(plotlist, 
+        #                             xLabel='page number', 
+        #                             yLabel=label)
+        graphics = plot.PlotGraphics([linesig], 
+                                     xLabel='page number', 
+                                     yLabel=label)
+        
         # Correctly set x-axis
         minticks = 2
-        self.canvas.SetXSpec(max(maxpage, minticks))
-        # Zoom out such that we can see the end of all curves
-        try:
-            # Something sometimes goes wrong here?
-            xcenter = np.average(np.array(plotcurve)[:,0])
-            ycenter = np.average(np.array(plotcurve)[:,1])
-            scale = (maxpage+2.)/maxpage
-            self.canvas.Zoom((xcenter,ycenter), (scale, scale))
-        except:
-            pass
-        # Redraw result
-        self.canvas.Redraw()
-                         
+        self.canvas.SetXSpec(max(maxpage-minpage, minticks))
+        self.canvas.Draw(graphics, xAxis=(minpage,maxpage))
+
         
-    def OnPageChanged(self, page):
+    def OnPageChanged(self, page, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip redundant
+            executions, so that the user interface does not stall.
+            For a list of possible triggers, see the doc string of
+            `tools`.
+        """
         # When parent changes
         # This is a necessary function for PyCorrFit.
         # This is stuff that should be done when the active page
         # of the notebook changes.
+        # Filter out unwanted triggers to improve speed.
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
+            return
+        elif trigger in ["tab_init"] and page is not None:
+            # No need to replot if the model did not change
+            if self.Page.modelid == page.modelid:
+                return
+        if (trigger in ["page_add_finalize"] and 
+            self.WXTextPages.GetValue() == "1"):
+            # We probably imported data with statistics window open
+            self.PageNumbers = range(1,
+                                  1+self.parent.notebook.GetPageCount())
+            setstring = misc.parsePagenum2String(self.PageNumbers)
+            self.WXTextPages.SetValue(setstring)
+            
         #
         # Prevent this function to be run twice at once:
         #
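
A note on the pattern above: the new `trigger` keyword acts as a cheap
event filter. Batch operations emit one page-changed notification per
page, and redrawing the statistics plot for each of them would stall
the interface. A minimal sketch of the idea (everything except the
trigger strings is illustrative):

    BATCH_TRIGGERS = ["parm_batch", "fit_batch", "page_add_batch"]

    class SomeTool(object):
        def OnPageChanged(self, page, trigger=None):
            # Ignore notifications fired while a batch operation is
            # still running; the final event repaints exactly once.
            if trigger in BATCH_TRIGGERS:
                return
            self.Page = page
            self.redraw()  # hypothetical repaint hook
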
diff --git a/src/tools/trace.py b/src/tools/trace.py
index 8eb5a4e..045c26a 100644
--- a/src/tools/trace.py
+++ b/src/tools/trace.py
@@ -28,6 +28,7 @@
     along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 
+import numpy as np
 
 import wx
 import wx.lib.plot as plot    
@@ -81,6 +82,8 @@ class ShowTrace(wx.Frame):
                                  width=1)
             lines = [line]
             self.canvas.SetEnableLegend(False)
+            xmax = np.max(self.trace[:,0])
+            xmin = np.min(self.trace[:,0])
         elif self.Page.tracecc is not None:
             # This means that we have two (CC) traces to plot
             self.tracea = 1*self.Page.tracecc[0]
@@ -93,16 +96,29 @@ class ShowTrace(wx.Frame):
                                   colour='red', width=1)
             lines = [linea, lineb]
             self.canvas.SetEnableLegend(True)
+            xmax = max(np.max(self.tracea[:,0]), np.max(self.traceb[:,0]))
+            xmin = min(np.min(self.tracea[:,0]), np.min(self.traceb[:,0]))
         else: 
             self.canvas.Clear()
             return
         # Plot lines
+        
         self.canvas.Draw(plot.PlotGraphics(lines, 
                                            xLabel='time [s]', 
-                                           yLabel='count rate [kHz]'))
-
-
-    def OnPageChanged(self, page=None):
+                                           yLabel='count rate [kHz]'),
+                                           xAxis=(xmin,xmax))
+
+
+    def OnPageChanged(self, page=None, trigger=None):
+        """
+            This function is called when something in the panel
+            changes. The variable `trigger` is used to skip redundant
+            executions, so that the user interface does not stall.
+            For a list of possible triggers, see the doc string of
+            `tools`.
+        """
+        if trigger in ["parm_batch", "fit_batch", "page_add_batch"]:
+            return
         self.Page = page
         # When parent changes
         if self.parent.notebook.GetPageCount() == 0:
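
The new xAxis bounds in this file are simply the extremes of the time
column, taken over one trace (autocorrelation) or two traces
(cross-correlation). The same computation as a standalone sketch,
assuming (N, 2) arrays with time in column 0:

    import numpy as np

    def axis_range(*traces):
        # Column 0 holds the time [s], column 1 the count rate [kHz].
        xmin = min(np.min(t[:, 0]) for t in traces)
        xmax = max(np.max(t[:, 0]) for t in traces)
        return xmin, xmax

    # axis_range(trace) for AC; axis_range(tracea, traceb) for CC
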
diff --git a/src/usermodel.py b/src/usermodel.py
index b9b5924..04b8f3d 100644
--- a/src/usermodel.py
+++ b/src/usermodel.py
@@ -102,7 +102,8 @@ class CorrFunc(object):
 
 
     def TestFunction(self):
-        """ Test the function for parsibility with the given parameters. """
+        """ Test the function for parsibility with the given parameters.
+        """
         vardict = dict()
         for i in np.arange(len(self.variables)):
             vardict[self.variables[i]] = sympify(float(self.values[i]))
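
For context, the method touched above substitutes sympified numeric
values for every declared variable and then lets sympy evaluate the
user-supplied model function. A rough standalone sketch of such a
check (the helper name is illustrative):

    import numpy as np
    from sympy import sympify

    def check_user_function(expression, variables, values):
        # Map every declared variable to a sympified numeric value...
        vardict = dict()
        for i in np.arange(len(variables)):
            vardict[variables[i]] = sympify(float(values[i]))
        # ...then parsing/substitution raises if the expression is bad.
        return sympify(expression).subs(vardict)

    # e.g. check_user_function("1/(1 + tau/taud)", ["taud"], [1e-3])
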

-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/debian-med/pycorrfit.git


