ENGLISH TO MALAYALAM TRANSLITERATION A PROJECT REPORT
Submitted by SUMAJA SASIDHARAN
(CB207CN020)
in partial fulfillment for the award of the degree of MASTER OF TECHNOLOGY IN COMPUTATIONAL ENGINEERING AND NETWORKING
AMRITA SCHOOL OF ENGINEERING, COIMBATORE
AMRITA VISHWA VIDYAPEETHAM COIMBATORE – 641 105
NOVEMBER 2008
AMRITA VISHWA VIDYAPEETHAM AMRITA SCHOOL OF ENGINEERING, COIMBATORE 641105
BONAFIDE CERTIFICATE
This is to certify that the mini-project report entitled “ENGLISH TO MALAYALAM TRANSLITERATION”, submitted by SUMAJA SASIDHARAN (Reg No: CB207CN020) in partial fulfillment of the requirements for the award of the degree of Master of Technology in COMPUTATIONAL ENGINEERING AND NETWORKING, is a bonafide record of the work carried out under my guidance and supervision at Amrita School of Engineering.
Project Guide
Dr. K.P. SOMAN
Head of the Department
Computational Engineering and Networking
AMRITA VISHWA VIDYAPEETHAM AMRITA SCHOOL OF ENGINEERING, COIMBATORE, 641105 DEPARTMENT OF COMPUTATIONAL ENGINEERING & NETWORKING
DECLARATION
I, SUMAJA SASIDHARAN (Reg No CB207CN020), hereby declare that this project report, entitled “ENGLISH TO MALAYALAM TRANSLITERATION”, is a record of the original work done by me under the guidance of Dr. K.P. Soman, H.O.D., CEN, Amrita School of Engineering, Coimbatore, and that this work has not formed the basis for the award of any degree / diploma / associateship / fellowship or a similar award to any candidate in any University, to the best of my knowledge.
SUMAJA SASIDHARAN Place: Date:
COUNTERSIGNED Dr K.P SOMAN Head of the Department Computational Engineering and Networking
ACKNOWLEDGEMENT Let the Almighty Lord be praised for His compassion, whose ample grace has helped me in the successful completion of my project.
I would like to take this opportunity to extend my most sincere gratitude to all those who provided their assistance and co-operation during the minor project work on “ENGLISH TO MALAYALAM TRANSLITERATION”.
I express my deep sense of gratitude to Dr.K.P SOMAN, Head of Department, Computational Engineering and Networking for his timely advice and constant encouragement and guidance throughout the progress of the project. His valuable thoughts and suggestions were critical throughout all the stages of this work.
I express my sincere gratitude to my project guide, Asst. Professor, Dept. of Information Technology, for her continuous support and encouragement.
I would also like to express my profound gratitude to Mr. Ajith, for his active discussions with me about the topic and insightful comments and constructive suggestions to improve the quality of this project work.
I express my special thanks to all my friends for sharing so many wonderful moments.
Last but not least, I express my thanks to all the staff members of our department, and to my parents and friends, who always stood by me with their valuable suggestions and help.
Abstract Transliteration is the mapping of a word or text written in one writing system into another writing system. For a specific pair of source and target languages, transliteration maps the letters of the source language to the letters of the target language, and it must preserve the sounds. Transliteration can also be used for encryption. Here the source language is English and the target language is Malayalam. In some cases the letters of the source script may not match the target script exactly, so transliteration usually defines conventions for dealing with such cases. The source string is segmented into transliteration units and related to the target language units; thus the transliteration problem can be viewed as a sequence labeling problem. Here the classification is done using a Support Vector Machine (SVM) and WEKA. In WEKA, each class is tested using different classifiers and the results are compared.
TABLE OF CONTENTS
Acknowledgement  iv
Abstract  v
Chapter 1 TRANSLITERATION  1
1.1 Introduction  1
1.2 History  2
1.3 Transliteration Schemes  3
1.4 Importance of Transliteration  4
Chapter 2 LITERATURE SURVEY  5
2.1 History of Malayalam Language  5
2.2 Writing System  5
2.3 Uses of Transliteration  6
2.4 Vagaries of Transliteration  7
2.5 Challenges in English to Malayalam Transliteration  8
Chapter 3 SEQUENCE LABELING APPROACH  10
3.1 Transliteration as a Sequence Labeling Problem  10
3.2 Preprocessing Phase  11
Chapter 4 SUPPORT VECTOR MACHINE  14
4.1 Introduction  14
4.2 Formalization  16
4.3 Primal Form  18
4.4 Dual Form  19
4.5 Soft Margin  19
4.6 SVM for Classification  24
4.7 Applications of SVM  25
4.8 Strength and Weakness of SVM  26
4.9 SVM Tool  26
Chapter 5 WEKA  29
5.1 Introduction  29
5.2 Features  29
5.3 Classifiers in WEKA  31
Chapter 6 IMPLEMENTATION  32
6.1 Training using SVM  32
6.2 Training using WEKA  75
Chapter 7 CONCLUSIONS  84
REFERENCES  85
CHAPTER 1 TRANSLITERATION 1.1 Introduction Transliteration is the practice of transcribing a word or text written in one writing system into another writing system. From a linguistic point of view, transliteration is a mapping from one system of writing into another, word by word, or ideally letter by letter. Transliteration attempts to be exact, so that an informed reader should be able to reconstruct the original spelling of unknown transliterated words. To achieve this objective, transliteration may define complex conventions for dealing with letters in the source script which do not correspond to letters in the goal script. Transliteration refers to the process by which one reads and pronounces the words and sentences of one language using the letters and special symbols of another language. The primary aim of transliteration is to provide an alternate means of reading text using a different script, and it is meant to preserve the sounds of the syllables in words. In practice, the same word may be written differently in different scripts due to the local conventions employed for pronouncing the aksharas. Transliteration is opposed to transcription, which specifically maps the sounds of one language to the best matching script of another language. Still, most systems of transliteration map the letters of the source script to letters pronounced similarly in the goal script, for some specific pair of source and goal languages. If the relations between letters and sounds are similar in both languages, a transliteration may be (almost) the same as a transcription. There are also mixed transliteration/transcription systems that transliterate a part of the original script and transcribe the rest. The word transliteration is often used to include both transliteration in the narrow sense and transcription. Anglicizing is a transcription method, while romanization encompasses several transliteration and transcription methods. Transliteration should be distinguished from transcription, which is a rendition of a word in a given script based on the word's sound, rather than a process of converting one script into another. Thus, the variants "Muhammad" vs. "Mohammed" or "Muslim" vs. "Moslem" are variants in transcription, based on the sound these words can take to the ear of an English speaker in different varieties of Arabic. It is challenging to translate names and technical terms across languages with different alphabets and sound inventories. These items are commonly transliterated, i.e., replaced with approximate phonetic equivalents. For example, "computer" in English comes out as "konpyuutaa" in Japanese.
1.2 History Transliteration has been in use in machine translation systems, e.g. Russian-English, since the beginning of the field of machine translation. Arababi (1994) developed a hybrid neural network and knowledge-based system to generate multiple English spellings for Arabic person names. Knight and Graehl (1998) developed a statistical model for back transliteration to transliterate Japanese katakana into English; transliteration was thus first studied as a machine learning problem using probabilistic finite-state transducers. Subsequently, the performance of this system was greatly improved by combining different spelling and phonetic models (Al-Onaizan and Knight, 2002). Virga and Khudanpur (2003) and Oh and Choi (2000) adopted a source-channel approach incorporating phonetics as an intermediate representation. Huang et al. (2004) constructed a probabilistic Chinese-English edit model as part of a larger alignment solution using a heuristically bootstrapped procedure. Gao et al. (2004) applied Maximum Entropy modeling to English-Chinese transliteration. Kang and Choi (2000) used a decision tree model for English-Korean transliteration. Recent work on sequence alignment, both local classifier-based modeling of complex learning problems (McCallum et al., 2000; Punyakanok and Roth, 2001) and global discriminative approaches based on CRFs (Lafferty et al., 2001), SVMs (Taskar et al., 2005), and the Perceptron algorithm (Collins, 2002), is being adopted for transliteration. Freitag and Khadivi (2007) propose a technique which combines conventional MT methods with a single-layer perceptron. Neural networks have been used in NLP in the past, e.g. for machine translation (Asunción Castaño et al., 1997) and constituent parsing (Titov and Henderson, 2007).
However, it might not be straightforward to obtain good results using neural networks in this domain. In general, when training a neural network, one has to choose the structure of the network, which involves certain trade-offs. If a small network with no hidden layer is chosen, it can be trained efficiently but has very limited representational power, and may be unable to learn the relationships between the source and the target language.
1.3 Transliteration Schemes The idea of transliteration is not new. For more than a century, printed books have used suitable transliteration schemes with Roman letters and diacritics to display text in Indian scripts. During the past several years, different methods have been introduced to prepare Indian language documents by entering the text through specific transliteration schemes. Data entry through transliteration is quite close to a phonetic mapping of Indian language characters to the letters of the Roman alphabet. Transliteration schemes employed while entering Indian language texts may have no connection with fonts at all. That is, the transliteration mechanism is only a means of identifying which letters, vowels, conjuncts, consonants, etc. are present in the text that should be displayed in the specified Indian language. The transliterated text is then converted into the required encoding of the characters to be shown, and this encoding is specific to the font chosen to display the text. The display of characters on a screen or printer requires what is known as a rendering program, which generates the shape of the character from the encoding used to represent it. Typically this is accomplished through fonts, and web browsers have excellent capabilities for handling different types of fonts. HTML documents may contain text in specific languages by specifying the font to be used while displaying the text. This was by and large the method used to include text in other languages within an HTML document. It must be noted that the browser viewing the document should be able to load the required font locally; only then is the text correctly displayed. In the absence of the required fonts, the browser will use some default and the text will not be intelligible at all. For Indian languages, the method using a specific font has somehow remained in use in spite of the variations observed in different fonts, even for a given language.
The situation has changed somewhat since Unicode support was introduced, and today Indian language text can be handled through Unicode. Preparing web pages (HTML documents) which include Indian language text requires the use of word processors which support such fonts. Even with that support, additional factors must be taken into account while entering text. Due to the fairly complex nature of the Indian scripts, data entry is quite cumbersome with most word processors. A point to remember here is that word processors are not yet universal enough to run on all platforms.
1.4 Importance of Transliteration Machine transliteration plays an important role in natural language applications such as information retrieval and machine translation, especially for handling proper nouns and technical terms. The common phonetic base across all the Indian languages is helpful in situations where language-independent information such as statistical data, addresses, schedules of meetings, etc. has to be disseminated in different languages simultaneously. People who can speak a language but do not know its script may still be able to read information in that language by merely reading it in a script familiar to them. Transliteration between Indian languages is very desirable to help people learn one language through another, and the common phonetic base makes this easy. Yet, transliteration between the languages has to be handled with care, for there are quite a few aksharas which are specific to some languages and not seen or used in others. For instance, Tamil does not have the aspirated consonants of Telugu or Sanskrit, and reading Sanskrit through Tamil, though very desirable, is often rendered difficult. Situations such as these are usually handled by introducing new symbols into the script of a language to represent, via transliteration, characters found in other languages.
CHAPTER 2 LITERATURE SURVEY 2.1 History of Malayalam Language The word "Malayalam" originally meant "mountainous country", where mala means mountain and alam means place. Malayalam belongs to the southern group of Dravidian languages, along with Tamil, Kota, Kodagu and Kannada, and has a high affinity towards Tamil. The origin of Malayalam as a distinct language may be traced to the last quarter of the 9th century A.D. Malayalam first appeared in writing in the Vazhappalli inscription, which dates from about 830 A.D. In the early thirteenth century the Malayalam script began to develop from a script known as vattezhuthu (round writing), a descendant of the Brahmi script, but Malayalam is now greatly simplified from the 900 glyphs it originally had. Malayalam is written in the Malayalam script, which is derived from the Grantha script; its rounded form was well suited to writing on palm leaf manuscripts, a preferred medium in ancient South India. Malayalam uses a large proportion of Sanskrit vocabulary; loans have also been made from Portuguese, Arabic, Syriac and, in more recent times, English.
2.2 Writing System In the early ninth century vattezhuthu (round writing), traceable through the Grantha script to the pan-Indian Brāhmī script, gave rise to the Malayalam writing system. It is syllabic in the sense that the sequence of graphic elements means that syllables have to be read as units, though in this system the elements representing individual vowels and consonants are for the most part readily identifiable. In the 1960s Malayalam dispensed with many special letters representing less frequent conjunct consonants and combinations of the vowel /u/ with different consonants. The Malayalam script consists of 53 letters, including 16 vowels and 37 consonants. The earlier style of writing was replaced by a new style from 1981. The new script reduces the number of distinct letters for typesetting from 900 to fewer than 90; this was mainly done so that Malayalam could be accommodated on the keyboards of typewriters and computers.
2.3 Uses of Transliteration Machine transliteration plays an important role in natural language applications such as information retrieval and machine translation, especially for handling proper nouns and technical terms. It is useful for machine translation, cross-lingual information retrieval, and multilingual text and speech processing. Transliteration is helpful in situations where one does not know the script of a language but can nevertheless speak and understand it. Transliterations are used where the original script is not available to write down a word, while high precision is still required: for example, traditional or cheap typesetting with a small character set; editions of old texts in scripts no longer used (such as Linear B); and some library catalogues. One useful feature of a transliterated representation of Indian language strings is that conventional string processing programs may be used to process the text. However, applications such as sorting will produce erroneous results, as the sorting orders of the aksharas and of the Roman letters are quite different. Many string processing applications, such as processing a sentence, may however work properly, so long as the input strings do not contain special characters which are needed for transliteration but can cause confusion if they happen to be the delimiters fixed for parsing routines. Transliteration is helpful in the following situations:
• When a user views names that are entered in a world-wide database, it is extremely helpful to view and refer to the names in the user's native script.
• When the user performs searching and indexing tasks, transliteration can retrieve information in a different script.
• When a service engineer is sent a program dump that is filled with characters from foreign scripts, it is much easier to diagnose the problem when the text is transliterated and the service engineer can recognize the characters.
The term transliteration is sometimes given a narrow meaning, implying that the transformation is reversible (sometimes called lossless). Transliteration can also be used to convert unfamiliar letters within the same script. Transliteration in the broader sense is a necessary process when you use words or concepts expressed in a language with a script other than yours. The idea of transliteration is complicated by the genuine use in multiple languages of different common nouns for the same person, place or thing. Thus, "Muhammad" is in common use now in English and "Mohammed" is less popular, though there are excellent reasons for each transcription. Transliteration is also used for simple encryption.
2.4 Vagaries of Transliteration The phonetic nature of Indian languages allows a more or less direct transliteration of text, preserving the basic sounds across the languages. For instance, a given word can be correctly shown in all the scripts of the country without much confusion. Yet when such words are individually written in these scripts, there will be some variations, because the words may be pronounced a bit differently as per the local convention in each region. These differences make transliteration somewhat confusing despite the fact that the sounds may be correct.
In Devanagari as well as other scripts, it may not be easy to correctly display the equivalent sounds of English words. Often the same word may be written differently but with almost identical sounds, so that the native speaker will correctly understand the word. This is usually the case when diphthongs are involved, as in words such as "bright" or "gown", which can be written in several different ways; when transliterated, these can lead to amusing representations in other scripts. Hence when data common to the whole country is presented in different scripts, viewers need additional time to interpret the word in the context of the original language from which the word comes. Transliteration might be appropriate for correctly showing the aksharas of one script in another, but the transliterated text will be correctly understood only if the reader knows the conventions used to represent the sounds, as well as the writing conventions, which may not strictly adhere to the phonetic rules.
2.5 Challenges in English to Malayalam Transliteration In a new awakening for the search of one's own roots and culture, language has become a key factor. A universally accepted method of transliterating Malayalam using the English alphabet would go a long way toward making correct pronunciation easier. A number of characters with slight differences in sound cannot be expressed correctly using the English alphabet. We do not have English characters to differentiate between സ and ശ, or between ല and ള. We face the same problem with ത, ഥ, ദ, ധ and ട, ഠ, ഡ, ഢ. We need modifications to suit our language so that the different shades of Malayalam sound can be written correctly. In our attempt to transliterate Malayalam, the following rules have to be followed consistently to avoid confusion:
1. One alphabet will represent only one sound.
2. Avoid diacritical marks (marks or pointers attached to letters to indicate differences of sound) as far as possible.
3. Computer keyboards should be able to handle all the characters.
4. For combined letters, the rule must be consistent. For example, if we use j for ജ and nj for ഞ, then ജ്ഞ should be jnj.
5. y should never be used to indicate the ഇ sound; y should only represent യ. For the i sound, always use i.
6. For long vowels, use the same character twice: അ = a and ആ = aa or ആ = A.
CHAPTER 3 SEQUENCE LABELING APPROACH 3.1 Transliteration as a Sequence Labeling Problem Transliteration maps the letters of the source script to the letters of the goal script. The process of transliteration mainly involves two steps:
• Segmentation of the source string into transliteration units.
• Mapping the source language transliteration units into the target language.
Thus the transliteration problem can be viewed as a sequence labeling problem from one language alphabet to another. Here the source language is English and the target language is Malayalam. An English name X, for example, is segmented into x1, x2, …, xn, where each xi corresponds to an alphabet in the name. Let the equivalent Malayalam name be Y, segmented as y1, y2, …, yn, where each yi is treated as a label in the label sequence. Each xi is then aligned with its phonetically equivalent yi:
x1 x2 … xn
y1 y2 … yn
To generate an efficient model, the phonetically equivalent segments should be properly aligned. The valid target language alphabet (yi) for a source language alphabet (xi) in a given source language input word depends on: the alphabet (xi) itself; the alphabets (xi-2, xi-1, xi+1, xi+2) surrounding the source language alphabet (xi); and the alphabets (yi-2, yi-1, yi+1, yi+2) surrounding the target language alphabet (yi). These features are used to train the model using a support vector machine. The transliteration model is then used to predict a target language word for a new source language word.
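The dependence of each yi on a context window around xi can be sketched in code. This is a minimal illustration of how training examples might be built from one aligned pair; the function and feature names are assumptions for illustration, not taken from the report's implementation:

```python
# Turn an aligned English/Malayalam unit pair into training examples:
# each source unit x_i becomes one example whose features are the
# surrounding units (x_{i-2} .. x_{i+2}) and whose label is y_i.

def window_features(units, i, size=2):
    """Context window of source units around position i, padded with '^'."""
    feats = {}
    for offset in range(-size, size + 1):
        j = i + offset
        feats[f"u[{offset}]"] = units[j] if 0 <= j < len(units) else "^"
    return feats

def make_examples(source_units, target_units):
    """One (features, label) pair per aligned transliteration unit."""
    assert len(source_units) == len(target_units)
    return [(window_features(source_units, i), target_units[i])
            for i in range(len(source_units))]

examples = make_examples(list("sahdev"), ["s", "a", "h", "d", "E", "v"])
print(examples[0][1])  # prints: s
```

Each (features, label) pair would then be fed to a multiclass classifier such as an SVM, with the labels drawn from the target-language unit inventory.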
The transliteration problem can thus be viewed as a multiclass classification problem. Training is done for every class to distinguish the examples of that class from the rest, and the most probable class labels are selected. For each alphabet in the source script, a dictionary is created from the training samples. The transliteration process consists of three phases:
• Preprocessing phase
• Training phase
• Transliteration phase
3.2 Preprocessing Phase During the preprocessing phase, the source language names are segmented and aligned with the corresponding segmented target language names. Preprocessing involves the following steps:
• Romanization
• Segmentation
• Alignment
3.2.1 Romanization In linguistics, romanization is the representation of a written word or spoken speech with the Roman alphabet, or a system for doing so, where the original word or language uses a different writing system. If the romanization attempts to transliterate the original script, the guiding principle is a one-to-one mapping of characters in the source language into the target script. During romanization, all the English words are converted into lowercase and mapped to the corresponding Malayalam words. These Malayalam words are then romanized using the mapping rules that define an English alphabet for each Malayalam alphabet. The following table shows the romanized output.
Table 3.1 Romanized Malayalam Names
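A fragment of such a character mapping can be sketched as follows. The report's full table is not reproduced here, so the entries below are illustrative assumptions (and this sketch deliberately ignores complications such as inherent vowels and conjunct signs):

```python
# Illustrative fragment of a Malayalam -> Roman mapping table.
# One Roman unit per Malayalam letter; capitals mark long vowels,
# following the aa/A convention mentioned in the report.

ROMAN = {
    "സ": "s", "ഹ": "h", "ദ": "d", "വ": "v",
    "അ": "a", "ആ": "A", "ഇ": "i", "ഈ": "I",
}

def romanize(word):
    """Replace each Malayalam character by its Roman unit; keep unknowns."""
    return "".join(ROMAN.get(ch, ch) for ch in word)

print(romanize("സഹ"))  # prints: sh
```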
3.2.2 Segmentation English names are segmented into transliteration units based on vowels, consonants, digraphs and trigraphs like sh, bh, ksh, th, ch, ng, nj, etc. Similarly, romanized Malayalam names are segmented into transliteration units based on vowels, consonants, digraphs and trigraphs like sh, TT, kk, ss, pp, ngk, njs, etc.
Table 3.2 Segmented English and Romanized Malayalam Names
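The segmentation step described above can be sketched as a longest-match scan: try trigraphs before digraphs before single letters. The unit inventory below is a small illustrative subset of the one used in the report, not the full list:

```python
# Longest-match segmentation into transliteration units.
# Multi-letter units (e.g. "ksh", "th") are matched before single letters.

UNITS = ["ksh", "ngk", "njs", "sh", "bh", "th", "ch", "ng", "nj",
         "TT", "kk", "ss", "pp", "aa", "ee", "oo"]

def segment(name):
    units, i = [], 0
    while i < len(name):
        for u in sorted(UNITS, key=len, reverse=True):  # longest first
            if name.startswith(u, i):
                units.append(u)
                i += len(u)
                break
        else:
            units.append(name[i])  # fall back to a single letter
            i += 1
    return units

print(segment("samath"))  # prints: ['s', 'a', 'm', 'a', 'th']
```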
3.2.3 Alignment After segmentation, the English names and the corresponding Malayalam names are aligned. If the number of units in the English and Malayalam names is equal, they can be aligned directly. If the number of units for a particular name differs between English and romanized Malayalam, a mismatch occurs. Consider the example:
s a h d e v (6 units)
s a h d E v (6 units)
Here the number of units is the same and the names can be properly aligned. A mismatch can be resolved by introducing an empty symbol or by combining adjacent units:
s a d y o j a t a (9 units)
s a d y y o j aa t a (10 units)
This mismatch can be resolved by combining two adjacent symbols:
s | a | d | y | o | j | a | t | a (9 units)
s | a | d | yy | o | j | aa | t | a (9 units)
Consider another example:
s a m a t h (6 units)
s a m a t (5 units)
One alphabet is missing in the romanized Malayalam name, so an empty symbol ^ is introduced:
s | a | m | a | t | h (6 units)
s | a | m | a | t | ^ (6 units)
The labels are the target language n-grams.
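The empty-symbol fix-up can be sketched in code. Only the padding case is shown; merging adjacent units (the yy/aa case) requires alignment heuristics the report does not spell out, so it is omitted here. The '^' symbol follows the convention stated in the text:

```python
# Pad the shorter of two unit sequences with the empty symbol '^'
# so that the English and Romanized-Malayalam sides align one-to-one.

def pad_align(src, tgt):
    """Return (src, tgt) with the shorter list padded to equal length."""
    diff = len(src) - len(tgt)
    if diff > 0:
        tgt = tgt + ["^"] * diff
    elif diff < 0:
        src = src + ["^"] * (-diff)
    return src, tgt

src, tgt = pad_align(list("samath"), list("samat"))
print(tgt)  # prints: ['s', 'a', 'm', 'a', 't', '^']
```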
CHAPTER 4 SUPPORT VECTOR MACHINES 4.1 Introduction Support vector machines map input vectors to a higher dimensional space where a maximal separating hyperplane is constructed. Two parallel hyperplanes are constructed on each side of the hyperplane that separates the data; the separating hyperplane is the one that maximizes the distance between these two parallel hyperplanes. An assumption is made that the larger the margin, or distance between the parallel hyperplanes, the better the generalization error of the classifier will be. A Support Vector Machine (SVM) performs classification by constructing an N-dimensional hyperplane that optimally separates the data into two categories. In the parlance of the SVM literature, a predictor variable is called an attribute, and a transformed attribute that is used to define the hyperplane is called a feature. The task of choosing the most suitable representation is known as feature selection. A set of features that describes one case (i.e., a row of predictor values) is called a vector. So the goal of SVM modeling is to find the optimal hyperplane that separates clusters of vectors in such a way that cases with one category of the target variable are on one side of the plane and cases with the other category are on the other side of the plane. The vectors near the hyperplane are the support vectors. The figure below presents an overview of the SVM process.
Many linear classifiers (hyperplanes) separate the data.
Figure 4.2 Linear Classifiers
However, only one achieves maximum separation. The data are classified as part of a machine-learning process. Each data point is represented by a p-dimensional vector (a list of p numbers), and each of these data points belongs to exactly one of two classes. We are interested in whether we can separate the classes with a (p − 1)-dimensional hyperplane. This is a typical form of linear classifier, and there are many linear classifiers that might satisfy this property. However, we are additionally interested in achieving maximum separation (margin) between the two classes: we pick the hyperplane so that the distance from the hyperplane to the nearest data point is maximized. Equivalently, the distance between the two parallel hyperplanes through the nearest points of the two classes is maximized. If such a hyperplane exists, it is clearly of interest and is known as the maximum-margin hyperplane, and such a linear classifier is known as a maximum margin classifier.
4.2 Formalization We consider data points of the form
{ (x1, c1), (x2, c2), …, (xn, cn) }
where ci is either 1 or −1, a constant denoting the class to which the point xi belongs. Each xi is a p-dimensional real vector, usually normalized to [0, 1] or [−1, 1] values. The scaling is important to guard against variables (attributes) with larger variance that might otherwise dominate the classification. We can view this as training data, which denotes the correct classification which we would like the SVM to eventually distinguish by means of the dividing (or separating) hyperplane, which takes the form
w . x − b = 0
The vector w points perpendicular to the separating hyperplane. Adding the offset parameter b allows us to increase the margin; in its absence, the hyperplane is forced to pass through the origin, restricting the solution. As we are interested in the maximum margin, we are interested in the support vectors and in the hyperplanes (parallel to the optimal hyperplane) closest to these support vectors in either class. It can be shown that these parallel hyperplanes can be described by the equations (after scaling w and b if necessary)
w . x − b = −1
w . x − b = 1
Figure 4.3: Maximum-margin hyperplanes for an SVM trained with samples from two classes.
Samples along the margin hyperplanes are called the support vectors. If the training data are linearly separable, we can select these hyperplanes so that there are no points between them, and then try to maximize their distance. By using geometry, we find the distance between the hyperplanes to be 2/|w|, so we want to minimize |w|. To keep data points out of the margin, we need to ensure that for all i either
w . xi − b ≥ 1 or w . xi − b ≤ −1
This can be rewritten as:
ci ( w . xi − b ) ≥ 1,  1 ≤ i ≤ n    (1)
4.3 Primal Form The problem now is to minimize |w| subject to the constraint (1). This is a quadratic programming (QP) optimization problem. More clearly:
minimize (1/2) ||w||²
subject to ci ( w . xi − b ) ≥ 1,  1 ≤ i ≤ n
The factor of 1/2 is used for mathematical convenience. This is the primal form.
4.4 Dual Form Writing the classification rule in its dual form reveals that classification is only a function of the support vectors, i.e., the training data that lie on the margin. The dual of the SVM can be shown to be:
max Σi αi − (1/2) Σi,j αi αj ci cj (xi . xj)
subject to αi ≥ 0
where the α terms constitute a dual representation for the weight vector in terms of the training set:
w = Σi αi ci xi
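The dual representation of the weight vector can be checked numerically on a toy example. The α values below are chosen by hand for a symmetric two-point problem (for which α1 = α2 = 0.5 happens to be the exact solution), not produced by a QP solver:

```python
# Recover w from dual coefficients via w = sum_i alpha_i * c_i * x_i.

def weight_from_dual(alphas, labels, points):
    """Accumulate alpha_i * c_i * x_i over all (support) vectors."""
    dim = len(points[0])
    w = [0.0] * dim
    for a, c, x in zip(alphas, labels, points):
        for k in range(dim):
            w[k] += a * c * x[k]
    return w

# Two support vectors at (+1, 0) and (-1, 0) with labels +1 / -1:
w = weight_from_dual([0.5, 0.5], [1, -1], [(1.0, 0.0), (-1.0, 0.0)])
print(w)  # prints: [1.0, 0.0]
```

With this w and b = 0, both points satisfy ci (w . xi − b) = 1, i.e. they lie exactly on the margin hyperplanes, as the dual picture predicts.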
4.5 Soft Margin The maximum margin idea can be modified to allow for mislabeled examples. If there exists no hyperplane that can split the "yes" and "no" examples, the soft margin method will choose a hyperplane that splits the examples as cleanly as possible, while still maximizing the distance to the nearest cleanly split examples. This work popularized the expression Support Vector Machine or SVM. The method introduces slack variables ξi, which measure the degree of misclassification of the datum xi:
ci ( w . xi − b ) ≥ 1 − ξi,  1 ≤ i ≤ n    (2)
Figure 4.4: SVM with Soft Margin
The objective function is then augmented by a term which penalizes non-zero ξi, and the optimization becomes a trade-off between a large margin and a small error penalty.
If the penalty function is linear, the optimization becomes:
min ||w||² + C Σi ξi
such that ci ( w . xi − b ) ≥ 1 − ξi,  1 ≤ i ≤ n
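Given a candidate (w, b), the slack variables and the penalized objective can be computed directly. The data below are toy values chosen for illustration; a real solver would search over (w, b) to minimize this quantity:

```python
# Soft-margin bookkeeping: xi_i = max(0, 1 - c_i (w.x_i - b)), and the
# penalized objective ||w||^2 + C * sum(xi_i) from the formula above.

def soft_margin_objective(w, b, data, C):
    dot = lambda u, v: sum(a * t for a, t in zip(u, v))
    slacks = [max(0.0, 1.0 - c * (dot(w, x) - b)) for x, c in data]
    return dot(w, w) + C * sum(slacks), slacks

# Two well-separated points plus one point inside the margin:
data = [((2.0, 0.0), 1), ((-2.0, 0.0), -1), ((0.5, 0.0), 1)]
obj, slacks = soft_margin_objective((1.0, 0.0), 0.0, data, C=1.0)
print(slacks)  # prints: [0.0, 0.0, 0.5]
```

Only the third point, which falls inside the margin, incurs a non-zero slack; increasing C raises the price of that violation relative to the ||w||² term.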
Constraint (2), along with the objective of minimizing |w|, can be solved using Lagrange multipliers. The key advantage of a linear penalty function is that the slack variables vanish from the dual problem, with the constant C appearing only as an additional constraint on the Lagrange multipliers. Non-linear penalty functions have been used, particularly to reduce the effect of outliers on the classifier, but unless care is taken the problem becomes non-convex, and it is then considerably more difficult to find a global solution. The parameters of the maximum-margin hyperplane are derived by solving this optimization. There exist several specialized algorithms for quickly solving the QP problem that arises from SVMs, mostly relying on heuristics for breaking the problem down into smaller, more manageable chunks. A common method is Platt's SMO algorithm, which breaks the problem into 2-dimensional sub-problems that may be solved analytically, eliminating the need for a numerical optimization algorithm such as conjugate gradient methods. When Straight Lines Go Crooked The simplest way to divide two groups is with a straight line, flat plane or an N-dimensional hyperplane. But what if the points are separated by a nonlinear region such as shown below?
Figure 4.5: Non-Linear Region

In this case we need a nonlinear dividing line. Rather than fitting nonlinear curves to the data, SVM handles this by using a kernel function to map the data into a different space, where a hyperplane can be used to do the separation.
The kernel function may transform the data into a higher dimensional space to make it possible to perform the separation.
Kernel: If the data are linearly separable, a separating hyperplane may be used to divide them. However, it is often the case that the data are far from linear and the classes are inseparable in the input space. To allow for this, kernels are used to non-linearly map the input data to a high-dimensional space; the data in the new space are then linearly separable. A very simple illustration of this is shown in the figure below.
Figure 4.6: Why use Kernel?
This mapping is defined by the Kernel:
Feature Space: Transforming the data into feature space makes it possible to define a similarity measure on the basis of the dot product. If the feature space is chosen suitably, pattern recognition can be easy.
x1 · x2 ← K(x1, x2) = Φ(x1) · Φ(x2)
Once w and b are obtained, the problem is solved for the simple linear scenario in which the data are separated by a hyperplane. The kernel trick allows SVMs to form nonlinear boundaries. The steps involved in the kernel trick are:
[a] The algorithm is expressed using only inner products of the data points. This formulation is called the dual problem.
[b] The original data are passed through non-linear maps to form new data in a higher-dimensional space, for example by adding pairwise products of some of the original data dimensions to each data vector.
[c] Rather than computing the inner product of these new, larger vectors explicitly (or storing them in tables for later lookup), we use a function that directly returns the dot product of the data after the non-linear mapping has been applied. This function is the kernel function.

Kernel Trick: Inner Product Summarization
Here we see that we only ever need the dot products of the data vectors used. The dot product of non-linearly mapped data can be expensive to compute; the kernel trick instead picks a suitable function that corresponds to the dot product of some non-linear mapping. Some of the most commonly chosen kernel functions are given later in this chapter. In practice a particular kernel is chosen by trial and error on the test set; choosing the right kernel for the problem or application enhances the SVM's performance.
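The identity K(x1, x2) = Φ(x1) · Φ(x2) can be checked numerically for one standard choice, the homogeneous polynomial kernel of degree 2 on 2-D inputs. This is a sketch; the sample vectors are arbitrary:

```python
import math

def phi(x):
    """Explicit degree-2 feature map for 2-D input:
    Phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return (x[0] ** 2, math.sqrt(2) * x[0] * x[1], x[1] ** 2)

def poly_kernel(x, z):
    """K(x, z) = (x . z)^2, computed without ever forming Phi."""
    return (x[0] * z[0] + x[1] * z[1]) ** 2

x, z = (1.0, 2.0), (3.0, 1.0)
explicit = sum(a * b for a, b in zip(phi(x), phi(z)))  # dot product in feature space
implicit = poly_kernel(x, z)                            # kernel trick
print(implicit, abs(explicit - implicit) < 1e-9)        # -> 25.0 True
```

Here the mapped space is only 3-dimensional, but for degree-d polynomials on n-dimensional inputs the explicit feature space grows combinatorially, while the kernel evaluation stays at one dot product plus one power.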
4.6 SVM for Classification
SVM is a useful technique for data classification. Although Neural Networks are often considered easier to use, they sometimes give unsatisfactory results. A classification task usually involves training and testing data consisting of a number of data instances. Each instance in the training set contains one target value and several attributes. The goal of SVM is to produce a model that predicts the target value of instances in the testing set, given only their attributes.
Classification in SVM is an example of supervised learning. Known labels help indicate whether the system is performing correctly. This information points to a desired response, validating the accuracy of the system, or can be used to help the system learn to act correctly. A step in SVM classification involves identifying the features that are intimately connected to the known classes. This is called feature selection or feature extraction. Feature selection and SVM classification together are useful even when prediction of unknown samples is not necessary: they can be used to identify the key feature sets involved in whatever processes distinguish the classes.
4.7 Applications of SVM
SVM has been found to be successful when used for pattern classification problems. Applying the Support Vector approach to a particular practical problem involves resolving a number of questions based on the problem definition and the design involved. One of the major challenges is choosing an appropriate kernel for the given application. There are standard choices, such as a Gaussian or polynomial kernel, that are the default options, but if these prove ineffective, or if the inputs are discrete structures, more elaborate kernels are needed. By implicitly defining a feature space, the kernel provides the description language used by the machine for viewing the data.

The task of text categorization is the classification of natural-text documents into a fixed number of predefined categories based on their content. Since a document can be assigned to more than one category, this is not a multi-class classification problem, but can be viewed as a series of binary classification problems, one for each category. One of the standard representations of text for the purposes of information retrieval provides an ideal feature mapping for constructing a Mercer kernel. Indeed, kernels incorporate a similarity measure between instances, and it is reasonable to assume that experts working in the specific application domain have already identified valid similarity measures, particularly in areas such as information retrieval and generative models. Traditional classification approaches perform poorly when working directly on text because of the high dimensionality of the data, but Support Vector Machines can avoid the pitfalls of very high-dimensional representations.

A very similar approach to the techniques described for text categorization can also be used for the task of image classification, and as in that case, linear hard-margin machines are frequently able to generalize well. The first real-world task on which Support Vector Machines were tested was the problem of hand-written character recognition. Furthermore, multi-class SVMs have been tested on these data. It is interesting not only to compare SVMs with other classifiers, but also to compare different SVMs amongst themselves. They turn out to have approximately the same performance, and furthermore to share most of their support vectors, independently of the chosen kernel. The fact that SVM can perform as well as these systems without including any detailed prior knowledge is certainly remarkable.
4.8 Strengths and Weaknesses of SVM
The major strengths of SVM are that training is relatively easy and that, unlike in neural networks, there are no local optima. It scales relatively well to high-dimensional data, and the trade-off between classifier complexity and error can be controlled explicitly. Its main weakness is the need for a good kernel function.
4.9 SVMTool
The SVMTool is a simple and effective generator of sequential taggers based on Support Vector Machines. We have applied the SVMTool to the problem of part-of-speech (PoS) tagging. By means of a rigorous experimental evaluation, we conclude that the proposed SVM-based tagger is robust and flexible for feature modeling (including lexicalization), trains efficiently with almost no parameters to tune, and is able to tag thousands of words per second, which makes it practical for real NLP applications. The SVMTool is intended to comply with all the requirements of modern NLP technology, combining simplicity, flexibility, robustness, portability and efficiency with state-of-the-art accuracy. This is achieved by working in the Support Vector Machines (SVM) learning framework.
The properties the SVMTool is intended to exhibit are:
• Simplicity: The SVMTool is easy to configure and to train. The learning is controlled by means of a very simple configuration file, and there are very few parameters to tune. The tagger itself is also very easy to use, accepting standard input and output pipelining. Embedded usage is also supported by means of the SVMTool API.
• Flexibility: The size and shape of the feature context can be adjusted. Rich features can be defined, including word and POS n-grams as well as ambiguity classes and "may be's", apart from lexicalized features for unknown words and sentence-level general information. The behaviour at tagging time is also very flexible, allowing different strategies.
• Robustness: The overfitting problem is well addressed by tuning the C parameter in the soft-margin version of the SVM learning algorithm. A sentence-level analysis may also be performed in order to maximize the sentence score. And, so that unknown words do not penalize system effectiveness too severely, several strategies have been implemented and tested.
• Portability: The SVMTool is language independent. It has been successfully applied to English and Spanish without a priori knowledge other than a supervised corpus. Moreover, for languages in which labeled data is a scarce resource, the SVMTool may also learn from unsupervised data based on the role of non-ambiguous words, with the only additional help of a morpho-syntactic dictionary.
• Accuracy: Compared to state-of-the-art POS taggers reported to date, it exhibits very competitive accuracy (over 97.1% for English on the WSJ corpus). Rich sets of features allow modeling most of the involved information very precisely, and the SVM learning paradigm is well suited to working accurately and efficiently with high-dimensional feature spaces.
4.9.1 Components
The SVMTool consists of three main components, namely the learner (SVMTlearn), the tagger (SVMTagger) and the evaluator (SVMTeval).
• SVMTlearn: Given a training set of examples (either annotated or unannotated), it is responsible for training a set of SVM classifiers.
• SVMTagger: Given a text corpus (one token per line) and the path to a previously learned SVM model, it performs the sequential tagging of a sequence of words.
• SVMTeval: Given SVMTool's predicted tagging output and the corresponding gold standard, it evaluates the performance in terms of accuracy.
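At tagging time, a primal SVM model of this kind reduces to computing, for each candidate class, a linear score over the active features of the current window and picking the highest-scoring class. The following is a minimal sketch of that scoring loop; the feature templates, weights, classes and biases are invented for illustration and are not taken from any real SVMTool model:

```python
def score(features, weights, bias):
    """Linear decision value w . x - b for binary indicator features."""
    return sum(weights.get(f, 0.0) for f in features) - bias

def tag(tokens, classes, weights, biases):
    """Greedy left-to-right tagging: highest-scoring class per token."""
    tags = []
    for i, tok in enumerate(tokens):
        prev = tokens[i - 1] if i > 0 else "_"
        feats = [f"C0:{tok}", f"C-1:{prev}"]  # tiny 2-feature window
        best = max(classes, key=lambda c: score(feats, weights[c], biases[c]))
        tags.append(best)
    return tags

# Toy model: class "KA" fires on current token "ka", "KI" on "ki".
weights = {"KA": {"C0:ka": 1.0}, "KI": {"C0:ki": 1.0}}
biases = {"KA": 0.0, "KI": 0.0}
print(tag(["ka", "ki"], ["KA", "KI"], weights, biases))  # -> ['KA', 'KI']
```

The real tagger uses a much richer feature window (word and tag n-grams, ambiguity classes, etc.), but the per-token decision is this same argmax over per-class linear scores.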
CHAPTER 5
WEKA: Machine Learning Algorithms in Java

5.1 Introduction
Weka is a collection of machine learning algorithms for data mining tasks. The algorithms can either be applied directly to a dataset or called from your own Java code. Weka contains tools for data pre-processing, classification, regression, clustering, association rules, and visualization. It is also well suited for developing new machine learning schemes. Weka is open-source software issued under the GNU General Public License. "Weka" stands for the Waikato Environment for Knowledge Analysis. Weka provides implementations of state-of-the-art learning algorithms that can be applied to datasets, and it also includes a variety of tools for transforming datasets, such as algorithms for discretization. We can preprocess a dataset, feed it into a learning scheme, and analyze the resulting classifier and its performance.
5.2 Features
The main strengths of Weka are that it:
• is freely available under the GNU General Public License,
• is very portable, because it is fully implemented in the Java programming language and thus runs on almost any modern computing platform,
• contains a comprehensive collection of data preprocessing and modeling techniques, and
• is easy for a novice to use, thanks to the graphical user interfaces it contains.
Weka supports several standard data mining tasks, more specifically data preprocessing, clustering, classification, regression, visualization, and feature selection. All of Weka's techniques are predicated on the assumption that the data is available as a single flat file or relation, where each data point is described by a fixed number of attributes (normally numeric or nominal attributes, although some other attribute types are also supported). Weka provides access to SQL databases using Java Database Connectivity and can process the result returned by a database query. It is not capable of multi-relational data mining, but there is separate software for converting a collection of linked database tables into a single table that is suitable for processing using Weka. Another important area that is currently not covered by the algorithms included in the Weka distribution is sequence modeling.

Weka's main user interface is the Explorer, but essentially the same functionality can be accessed through the component-based Knowledge Flow interface and from the command line. There is also the Experimenter, which allows the systematic comparison of the predictive performance of Weka's machine learning algorithms on a collection of datasets.

The Explorer interface has several panels that give access to the main components of the workbench. The Preprocess panel has facilities for importing data from a database, a CSV file, etc., and for preprocessing this data using a so-called filtering algorithm. These filters can be used to transform the data and make it possible to delete instances and attributes according to specific criteria. The Classify panel enables the user to apply classification and regression algorithms to the resulting dataset, to estimate the accuracy of the resulting predictive model, and to visualize erroneous predictions, ROC curves, etc., or the model itself.
The Associate panel provides access to association rule learners that attempt to identify all important interrelationships between attributes in the data. The Cluster panel gives access to the clustering techniques in Weka, e.g., the simple k-means algorithm; there is also an implementation of the expectation-maximization algorithm for learning a mixture of normal distributions. The next panel, Select attributes, provides algorithms for identifying the most predictive attributes in a dataset. The last panel, Visualize, shows a scatter plot matrix, where individual scatter plots can be selected, enlarged, and analyzed further using various selection operators.
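Weka's single-flat-file assumption is usually met by supplying data in its ARFF format, which the Preprocess panel imports directly. A minimal example of generating such a file (the relation, attribute names and values here are a toy weather dataset invented for illustration):

```python
# Write a tiny ARFF file: a header declaring the relation and its
# attributes, followed by the data rows, one instance per line.
arff = """@relation weather
@attribute outlook {sunny, overcast, rainy}
@attribute temperature numeric
@attribute play {yes, no}
@data
sunny,85,no
overcast,83,yes
rainy,70,yes
"""
with open("weather.arff", "w") as f:
    f.write(arff)
```

Nominal attributes enumerate their legal values in braces, while numeric attributes are declared with the `numeric` keyword; the last attribute is conventionally the class to be predicted.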
5.3 Classifiers in WEKA

5.3.1 J48
A decision tree is a predictive machine-learning model that decides the target value (dependent variable) of a new sample based on various attribute values of the available data. The internal nodes of a decision tree denote the different attributes, the branches between the nodes give the possible values that these attributes can take in the observed samples, and the terminal nodes give the final value (classification) of the dependent variable. The attribute that is to be predicted is known as the dependent variable, since its value depends upon, or is decided by, the values of all the other attributes. The other attributes, which help in predicting the value of the dependent variable, are known as the independent variables in the dataset.

The J48 decision-tree classifier uses the following simple algorithm. In order to classify a new item, it first needs to create a decision tree based on the attribute values of the available training data. So, whenever it encounters a training set, it identifies the attribute that discriminates the various instances most clearly. This feature, which tells us the most about the data instances so that we can classify them best, is said to have the highest information gain. Now, among the possible values of this feature, if there is any value for which there is no ambiguity, that is, for which the data instances falling within its category all have the same value for the target variable, then we terminate that branch and assign to it the target value that we have obtained. For the other cases, we then look for another attribute that gives us the highest information gain, and we continue in this manner until we either get a clear decision of which combination of attributes gives us a particular target value, or we run out of attributes.
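The attribute chosen at each node is the one with the highest information gain, which can be computed directly from class-label entropies. A sketch on invented toy data (the rows and attribute names are made up for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy, in bits, of a list of class labels."""
    n = len(labels)
    return -sum((k / n) * math.log2(k / n) for k in Counter(labels).values())

def information_gain(rows, attr, target):
    """Entropy of the target minus the weighted entropy of the
    target after splitting the rows on the given attribute."""
    before = entropy([r[target] for r in rows])
    after = 0.0
    for v in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == v]
        after += len(subset) / len(rows) * entropy(subset)
    return before - after

# An attribute that splits the classes perfectly gains the full entropy (1 bit).
rows = [{"a": "x", "cls": "yes"}, {"a": "x", "cls": "yes"},
        {"a": "z", "cls": "no"}, {"a": "z", "cls": "no"}]
print(information_gain(rows, "a", "cls"))  # -> 1.0
```

An attribute whose split leaves the class distribution unchanged would score a gain of 0, so the tree builder prefers attributes whose values yield the purest subsets.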
In the event that we run out of attributes, or if we cannot get an unambiguous result from the available information, we assign this branch the target value that the majority of the items under it possess. Now that we have the decision tree, we classify a new instance by following the order of attribute selection obtained for the tree: by checking all the respective attributes and their values against those seen in the decision tree model, we can assign or predict the target value of the new instance.

5.3.2 Naive Bayes Classifier
The Naive Bayes classifier works on a simple but comparatively intuitive concept, and in some cases it even outperforms many far more complex algorithms. It makes use of the variables contained in the data sample by observing them individually, independently of each other.

The Naive Bayes classifier is based on the Bayes rule of conditional probability. It makes use of all the attributes contained in the data and analyses them individually, as though they were equally important and independent of each other. For example, consider that the training data consist of various animals (say elephants, monkeys and giraffes), and our classifier has to classify any new instance that it encounters. We know that elephants have attributes such as a trunk, huge tusks, a short tail and an extremely large body. Monkeys are short, jump around a lot, and can climb trees, whereas giraffes are tall and have a long neck and short ears.

The Naive Bayes classifier considers each of these attributes separately when classifying a new instance. So, when checking whether the new instance is an elephant, it does not check whether it has a trunk and has huge tusks and is large. Rather, it separately checks whether the new instance has a trunk, whether it has tusks, whether it is large, and so on. It works under the assumption that each attribute is independent of the other attributes in the sample.
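The elephant/monkey example above can be sketched as a tiny categorical Naive Bayes classifier. This is an illustration with invented attributes; it uses add-one smoothing, which the text does not discuss but which avoids zero probabilities for unseen attribute values:

```python
from collections import Counter, defaultdict

def train_nb(samples):
    """samples: list of (attribute dict, label). Count class priors and,
    per (label, attribute), how often each value was seen."""
    priors = Counter(label for _, label in samples)
    counts = defaultdict(Counter)
    for feats, label in samples:
        for attr, val in feats.items():
            counts[(label, attr)][val] += 1
    return priors, counts

def predict_nb(feats, priors, counts):
    """argmax over labels of P(label) * prod P(value | label, attr),
    treating every attribute as independent of the others."""
    total = sum(priors.values())
    best, best_p = None, -1.0
    for label, n in priors.items():
        p = n / total
        for attr, val in feats.items():
            seen = counts[(label, attr)]
            p *= (seen[val] + 1) / (sum(seen.values()) + len(seen) + 1)  # add-one smoothing
        if p > best_p:
            best, best_p = label, p
    return best

samples = [({"size": "big", "trunk": "yes"}, "elephant"),
           ({"size": "big", "trunk": "yes"}, "elephant"),
           ({"size": "small", "trunk": "no"}, "monkey")]
priors, counts = train_nb(samples)
print(predict_nb({"size": "big", "trunk": "yes"}, priors, counts))  # -> elephant
```

Each attribute contributes its conditional probability independently, exactly as described above; no joint "trunk and tusks and large" probability is ever estimated.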
CHAPTER 6
IMPLEMENTATION

6.1 Training using SVM

6.1.1 Training Phase
The aligned source-language and target-language names are given as the input sequence and label sequence, respectively, for training, in the format required by SVMTool. The features required for training are defined with a window size of 5 elements, with the core alphabet in the third position. SVMTool uses SVMlight for training. SVM learning uses a linear kernel, and the learning time remains linear with respect to the number of examples. The trained model is used for transliterating English words into Malayalam words.

6.1.2 Output of the Training Phase
The training phase produces a merged primal model file listing, for each output class, a bias and a weight for every feature that survived frequency filtering. The header and a few representative entries are reproduced below; the full weight listing runs to several pages.

# SVMTool v1.3 MERGED PRIMAL MODEL
# SLIDING WINDOW: length [5] :: core [2]
# FEATURE FILTERING: min frequency [2] :: max mapping size [100000]
# C-PARAMETER: 0.1086
# =========================================================================
BIASES
BA:1.026159  BE:0.30015994  BI:1.0790639  BO:0.90864466  BU:0.13422592
Ba:-1.0262181  Be:-0.30023148  Bi:-1.0788952  Bo:-0.66410723  Bu:-0.99627329
...
C0~-1,0,1:a~ba~d  Ba:-3.67737675462012e-11  bA:-3.67737675462012e-11
C0~-1,0:a~ti  TTI:-0.00341006431381021
C0~-1,1:ga~na  TTI:-0.0145174035046015
...
C0~-2,-1,1:_~ku~li njjo:-0.0013669466430929 C0~-2,-1,1:_~la~ra njjo:-0.0306792118542371 C0~-2,-1,1:_~mu~la njjo:-0.00125303442283516 C0~-2,-1,1:a~ba~. D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:1.86970029414429e-11 C0~-2,-1,1:a~ko~. ddh:-2.83845604763285e-10 C0~-2,-1,1:a~li~d MO:-8.61188658094846e-09 mO:-8.61188658094846e-09 mo:8.61188658094846e-09 C0~-2,-1,1:a~l~ra MO:-2.41774429664504e-08 mO:-2.41774429664504e-08 mo:2.41774429664504e-08 C0~-2,-1,1:a~m~la ddh:-1.46333064123832e-10 C0~-2,-1,1:a~n~r ddh:-2.39371510894065e-18 C0~-2,-1,1:a~n~ri ddh:-1.29615147987424e-10 C0~-2,-1,1:bhau~n~. TTI:-6.11241809240577e-05 C0~-2,-1,1:bha~do~. TTI:-0.000265091182345496 C0~-2,-1,1:g~va~. TTI:-0.000678453894341931 C0~-2,-1,1:ku~n~. TTI:-0.000330257461449756 C0~-2,-1,1:llu~n~r ddh:-2.50768773261406e-10 C0~-2,-1,1:ng~ha~. TTI:-0.000983122252755619 C0~-2,-1,1:ni~pa~. TTI:-0.000842676216647677 C0~-2,-1,1:ra~m~ra TTI:-0.00233978267051983 C0~-2,-1,1:r~ga~. njjo:-0.00650506598472048 C0~-2,-1,1:r~ku~a TTI:-0.000200245165257415 C0~-2,-1:_~a Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:3.2727235499538e-19 C0~-2,-1:_~be TTI:-5.46414505266872e-05 C0~-2,-1:_~bha TTI:-0.000706083090522109 C0~-2,-1:_~bhi TTI:-0.000847158458970901 C0~-2,-1:_~bi njjo:0.051172445470808 C0~-2,-1:_~bu TTI:-0.00159279963573215 ddh:8.10562606827128e-10 C0~-2,-1:_~e TTI:-0.00277669394409798 C0~-2,-1:_~ga TTI:-0.0145174035046015 njjo:-0.000135085406972062 C0~-2,-1:_~gi njjo:-0.00411007289510381 C0~-2,-1:_~ku njjo:-0.0013669466430929 C0~-2,-1:_~la njjo:-0.0306792118542371 C0~-2,-1:_~mo TTI:-0.00129966236834032 C0~-2,-1:_~mu njjo:-0.00125303442283516 C0~-2,-1:a~ba D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:1.86970029414429e-11 C0~-2,-1:a~ko ddh:-2.83845604763285e-10 C0~-2,-1:a~l MO:-2.41774429664504e-08 mO:-2.41774429664504e-08 mo:2.41774429664504e-08 C0~-2,-1:a~li MO:-8.61188658094846e-09 mO:-8.61188658094846e-09 
mo:8.61188658094846e-09 C0~-2,-1:a~m ddh:-1.46333064123832e-10 C0~-2,-1:a~ma MO:-2.44801288080114e-08 mO:-2.44801288080114e-08 mo:2.44801288080114e-08 C0~-2,-1:a~n ddh:-1.29615150381139e-10 48
C0~-2,-1:a~r TTI:-0.000739228146821998 C0~-2,-1:be~l TTI:-0.00405809811408058 C0~-2,-1:bhau~n TTI:-6.11241809240577e-05 C0~-2,-1:bha~do TTI:-0.000265091182345496 C0~-2,-1:bi~l TTI:-0.000131032481788732 C0~-2,-1:bi~rai TTI:-0.00143474387129284 C0~-2,-1:bu~ri TTI:-0.00703525043711547 C0~-2,-1:b~ri TTI:0.0731846358804615 C0~-2,-1:di~nga TTI:-0.00190506164886276 C0~-2,-1:ga~ba TTI:-0.000222592738708743 C0~-2,-1:ga~n TTI:-0.00287470534802889 C0~-2,-1:ga~s TTI:-0.00217452842494425 C0~-2,-1:g~va TTI:-0.000678453894341931 C0~-2,-1:ke~va TTI:-0.000447012591601141 C0~-2,-1:ku~n TTI:-0.000330257461449756 C0~-2,-1:llu~n ddh:-2.50768773261406e-10 C0~-2,-1:l~ba TTI:-0.002259878048601 C0~-2,-1:l~ka TTI:-0.00119232453714038 C0~-2,-1:ma~sa njjo:-0.000148244086889984 C0~-2,-1:mu~kh TTI:-3.2644782834012e-05 C0~-2,-1:m~ri TTI:-0.00766143234527358 C0~-2,-1:ng~ha TTI:-0.000983122252755619 C0~-2,-1:ni~pa TTI:-0.000842676216647677 C0~-2,-1:ni~ya njjo:-0.00697478417695654 C0~-2,-1:ra~j MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 C0~-2,-1:ra~ku TTI:-0.00175342358677956 C0~-2,-1:ra~m TTI:-0.00233978267051983 C0~-2,-1:ri~a TTI:-0.00341006431381021 C0~-2,-1:r~ga njjo:-0.00650506598472048 C0~-2,-1:r~ka TTI:-0.000189209088741174 C0~-2,-1:r~ku TTI:-0.000200245165257415 C0~-2:_ Ba:-3.67737675462012e-11 TTI:-0.0217944424527916 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:8.10562606827128e-10 njjo:0.013628094248567 C0~-2:a D:-1.86970029414429e-11 MO:-5.72694583554103e-08 T:-8.87726942275319e-13 TTI:-0.000739228146821998 d:-1.86970029414429e-11 ddh:-5.59793819268256e-10 mO:5.72694583554103e-08 mo:-5.72694583554103e-08 C0~-2:b TTI:0.0731846358804615 C0~-2:be TTI:-0.00405809811408058 C0~-2:bha TTI:-0.00388182466406888 C0~-2:bhau TTI:-6.11241809240577e-05 C0~-2:bi TTI:-0.00156577635308157 C0~-2:bo TTI:-0.00110088935224245 C0~-2:bu TTI:-0.00703525043711547 C0~-2:di TTI:-0.00190506164886276 C0~-2:e 
TTI:-0.000123415793751178 C0~-2:g TTI:-0.000678453894341931 C0~-2:ga TTI:-0.00527182651168189 49
C0~-2:ke TTI:-0.000447012591601141 C0~-2:ku TTI:-0.000330257461449756 C0~-2:l TTI:-0.00345220258574139 C0~-2:llu ddh:-2.50768773261406e-10 C0~-2:m TTI:-0.00766143234527358 C0~-2:ma njjo:-0.000148244086889984 C0~-2:mu TTI:-3.2644782834012e-05 C0~-2:ng TTI:-0.000983122252755619 C0~-2:ni TTI:-0.000842676216647677 njjo:-0.00697478417695654 C0~-2:no TTI:-0.000147490387747709 C0~-2:r TTI:-0.00209445175640313 njjo:-0.00650506598472048 C0~-2:ra MO:-4.75586896143878e-08 TTI:-0.00409320625729938 mO:4.75586896143878e-08 mo:-4.75586896143878e-08 C0~-2:ri TTI:-0.00341006431381021 C0~-2:t TTI:-0.00147468337913345 C0~0,1,2:ba~d~. Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 C0~0,1,2:dha~.~_ ddh:-2.83845604763285e-10 C0~0,1,2:dha~la~. ddh:-1.46333064123832e-10 C0~0,1,2:dha~r~. ddh:-2.50768773261406e-10 C0~0,1,2:d~.~_ D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:1.86970029414429e-11 C0~0,1,2:mo~d~. MO:-8.61188658094846e-09 mO:-8.61188658094846e-09 mo:8.61188658094846e-09 C0~0,1,2:mo~ra~. MO:-2.44801288080114e-08 mO:-2.44801288080114e-08 mo:2.44801288080114e-08 C0~0,1,2:mo~ti~. MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 C0~0,1,2:ti~.~_ TTI:-0.00744129542224498 C0~0,1,2:ti~dha~r TTI:-7.09462378154227e-05 C0~0,1,2:ti~m~. TTI:-0.002259878048601 C0~0,1,2:ti~ng~. TTI:-0.000111604940724642 C0~0,1,2:ti~pu~r TTI:-0.00247038227448224 C0~0,1,2:ti~ya~. TTI:-0.000131032481788732 C0~0,1:ba~d Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 C0~0,1:dha~. ddh:-2.83845604763285e-10 C0~0,1:dha~ba ddh:-3.05524784074626e-18 C0~0,1:dha~ga ddh:2.64794548787695e-09 C0~0,1:dha~khe ddh:-5.82715939834801e-10 C0~0,1:dha~la ddh:-1.46333064123832e-10 C0~0,1:dha~na ddh:-3.72750083329056e-18 C0~0,1:dha~r ddh:-2.50768775655121e-10 C0~0,1:dha~ri ddh:-1.29615147987424e-10 C0~0,1:dha~u ddh:-3.2727235499538e-19 C0~0,1:d~. 
D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 C0~0,1:mo~d MO:-8.61188658094846e-09 mO:-8.61188658094846e-09 mo:8.61188658094846e-09 50
C0~0,1:mo~ra MO:-4.86575717744618e-08 mO:-4.86575717744618e-08 mo:4.86575717744618e-08 C0~0,1:mo~ti MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 C0~0,1:ti~. TTI:-0.00744129542224498 C0~0,1:ti~a TTI:-0.00156918911120731 C0~0,1:ti~ba TTI:-0.000123415793751178 C0~0,1:ti~dha TTI:-7.09462378154227e-05 C0~0,1:ti~ga TTI:-0.0017897139889239 C0~0,1:ti~k TTI:-0.00405809811408058 C0~0,1:ti~kha TTI:-0.00143474387129284 C0~0,1:ti~ko TTI:-0.00695106166046719 C0~0,1:ti~ku TTI:-0.000447012591601141 C0~0,1:ti~m TTI:-0.002259878048601 C0~0,1:ti~ma TTI:-0.00119390023876806 C0~0,1:ti~n TTI:-0.00128380525685815 C0~0,1:ti~na TTI:-0.0145174035046015 C0~0,1:ti~ng TTI:-0.000111604940724642 C0~0,1:ti~ppa TTI:-0.00175342358677956 C0~0,1:ti~pu TTI:-0.00247038227448224 C0~0,1:ti~ra TTI:-0.00233978267051983 C0~0,1:ti~va TTI:-7.76041480165313e-05 C0~0,1:ti~ya TTI:-0.000870260628610729 C0~0:ba Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 C0~0:d D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 C0~0:dha ddh:1.4172952320153e-17 C0~0:mo MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 C0~0:njo njjo:-8.40256683676266e-19 C0~0:ti TTI:-3.17806761809813e-18 C0~1,2:.~_ D:-1.86970029414429e-11 T:-8.87726942275319e-13 TTI:0.00744129542224498 d:-1.86970029414429e-11 ddh:-2.83845604763285e-10 njjo:0.00650506598472048 C0~1,2:ba~r ddh:-3.05524784074626e-18 C0~1,2:ba~y TTI:-0.000123415793751178 C0~1,2:bhu~. TTI:-0.00341006431381021 C0~1,2:bu~ru ddh:-5.46161792072088e-10 C0~1,2:dha~r TTI:-7.09462378154227e-05 C0~1,2:d~. 
Ba:-3.67737675462012e-11 MO:-8.61188658094846e-09 bA:3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:3.67737675462012e-11 mO:-8.61188658094846e-09 mo:-8.61188658094846e-09 C0~1,2:ga~n TTI:-0.000597389451783517 C0~1,2:ga~ng TTI:-0.00119232453714038 C0~1,2:ga~ya ddh:2.64794548787695e-09 C0~1,2:ge~ri TTI:-5.46414505266872e-05 C0~1,2:h~ri njjo:-0.00697478417695654 C0~1,2:ja~la TTI:-0.000147490387747709 C0~1,2:kha~li ddh:-7.08505145415428e-10 51
C0~1,2:kha~na TTI:-0.00143474387129284 C0~1,2:khe~ra ddh:-5.82715939834801e-10 C0~1,2:ko~ppa TTI:-0.00277669394409798 C0~1,2:ko~r TTI:-0.00287470534802889 C0~1,2:ko~ra TTI:-0.00129966236834032 C0~1,2:ku~n TTI:-0.000447012591601141 C0~1,2:k~ri TTI:-0.00405809811408058 C0~1,2:lai~ya TTI:-0.00766143234527358 C0~1,2:la~. ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 C0~1,2:li~. njjo:-0.0013669466430929 C0~1,2:ma~li TTI:-0.00119390023876806 C0~1,2:m~. TTI:-0.002259878048601 C0~1,2:na~ko TTI:-0.0145174035046015 C0~1,2:na~la ddh:-3.72750083329056e-18 C0~1,2:ng~. TTI:-0.000111604940724642 C0~1,2:n~da TTI:-3.77474009231699e-05 C0~1,2:n~di TTI:-0.000847158458970901 C0~1,2:n~pa TTI:-0.000398899396964083 C0~1,2:ppa~. TTI:-0.00175342358677956 C0~1,2:pu~r TTI:-0.00247038227448224 C0~1,2:ra~. MO:-2.44801288080114e-08 mO:-2.44801288080114e-08 mo:2.44801288080114e-08 njjo:-0.0306792118542371 C0~1,2:ra~di MO:-2.41774429664504e-08 mO:-2.41774429664504e-08 mo:2.41774429664504e-08 C0~1,2:ra~i njjo:0.051172445470808 C0~1,2:ra~k TTI:-0.00233978267051983 C0~1,2:ri~ri ddh:-1.29615147987424e-10 C0~1,2:r~. ddh:-2.50768773261406e-10 njjo:-0.000148244086889984 C0~1,2:r~ka TTI:-0.00190506164886276 ddh:-2.39371510894065e-18 C0~1,2:s~ta TTI:-0.00703525043711547 C0~1,2:s~wa TTI:-0.00217452842494425 C0~1,2:tha~ku njjo:-0.00411007289510381 C0~1,2:ti~. MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 njjo:-0.000135085406972062 C0~1,2:u~ra ddh:-3.2727235499538e-19 C0~1,2:va~. TTI:-7.76041480165313e-05 C0~1,2:ya~. TTI:-0.000131032481788732 C0~1,2:ya~ka TTI:-0.000739228146821998 C0~1:. 
D:-1.86970029414429e-11 T:-8.87726942275319e-13 TTI:-0.00744129542224498 d:-1.86970029414429e-11 ddh:-2.83845604763285e-10 njjo:-0.00650506598472048 C0~1:a TTI:-0.00156918911120731 C0~1:ba TTI:-0.000123415793751178 ddh:-3.05524784074626e-18 C0~1:bhu TTI:-0.00341006431381021 C0~1:bu ddh:-5.46161792072088e-10 C0~1:d Ba:-3.67737675462012e-11 MO:-8.61188658094846e-09 bA:-3.67737675462012e11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 C0~1:dha TTI:-7.09462378154227e-05 C0~1:ga TTI:-0.0017897139889239 ddh:2.64794548787695e-09 52
C0~1:ge TTI:-5.46414505266872e-05 C0~1:h njjo:-0.00697478417695654 C0~1:ja TTI:-0.000147490387747709 C0~1:k TTI:-0.00405809811408058 C0~1:kha TTI:-0.00143474387129284 ddh:-7.08505145415428e-10 C0~1:khe ddh:-5.82715939834801e-10 C0~1:ko TTI:-0.00695106166046719 C0~1:ku TTI:-0.000447012591601141 C0~1:la ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 C0~1:lai TTI:-0.00766143234527358 C0~1:li njjo:-0.0013669466430929 C0~1:m TTI:-0.002259878048601 C0~1:ma TTI:-0.00119390023876806 C0~1:n TTI:-0.00128380525685815 C0~1:na TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 C0~1:ng TTI:-0.000111604940724642 C0~1:ppa TTI:-0.00175342358677956 C0~1:pu TTI:-0.00247038227448224 C0~1:r TTI:-0.00190506164886276 ddh:-2.50768775655121e-10 njjo:0.000148244086889984 C0~1:ra MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 C0~1:ri ddh:-1.29615147987424e-10 C0~1:s TTI:-0.00920977886205972 C0~1:sh TTI:0.0731846358804615 C0~1:tha njjo:-0.00411007289510381 C0~1:ti MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:-4.75586896143878e08 njjo:-0.000135085406972062 C0~1:u ddh:-3.2727235499538e-19 C0~1:va TTI:-7.76041480165313e-05 C0~1:ya TTI:-0.000870260628610729 C0~2:. 
Ba:-3.67737675462012e-11 MO:-8.06507050033476e-08 TTI:0.00774360751972067 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:3.67737675462012e-11 be:-3.67737675462012e-11 ddh:-3.97101837385238e-10 mO:8.06507050033476e-08 mo:-8.06507050033476e-08 njjo:-0.0335825224140272 C0~2:_ D:-1.86970029414429e-11 T:-8.87726942275319e-13 TTI:-0.00744129542224498 d:-1.86970029414429e-11 ddh:-2.83845604763285e-10 njjo:-0.00650506598472048 C0~2:ba TTI:-3.2644782834012e-05 C0~2:co TTI:0.0731846358804615 C0~2:da TTI:-3.77474009231699e-05 C0~2:di MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 C0~2:hi TTI:-0.00136894394594989 C0~2:i njjo:0.051172445470808 C0~2:k TTI:-0.00233978267051983 C0~2:ka TTI:-0.00264428979568476 ddh:-2.39371510894065e-18 C0~2:ko TTI:-0.0145174035046015 C0~2:ku njjo:-0.00411007289510381 C0~2:la TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 53
C0~2:li TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 C0~2:n TTI:-0.00104440204338466 C0~2:na TTI:-0.00143474387129284 C0~2:ng TTI:-0.00119232453714038 C0~2:pa TTI:-0.000398899396964083 C0~2:ppa TTI:-0.00277669394409798 C0~2:r TTI:-0.00541603386032655 ddh:-3.05524784074626e-18 C0~2:ra TTI:-0.00129966236834032 ddh:-5.82715940162074e-10 C0~2:ri TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:0.00697478417695654 C0~2:ru ddh:-5.46161792072088e-10 C0~2:su TTI:-0.000200245165257415 C0~2:ta TTI:-0.00703525043711547 C0~2:wa TTI:-0.00217452842494425 C0~2:y TTI:-0.000123415793751178 C0~2:ya TTI:-0.00766143234527358 ddh:2.64794548787695e-09 C1~-1,1,2:Ba~??~?? TTI:-0.000706083090522109 C1~-1,1,2:Bi~??~?? TTI:-0.000847158458970901 C1~-1,1,2:M TTI:-0.00233978267051983 ddh:-1.46333064123832e-10 C1~-1,1,2:N TTI:-0.00287470534802889 C1~-1,1,2:N~??~?? TTI:-6.11241809240577e-05 C1~-1,1,2:Ri TTI:-0.00766143234527358 C1~-1,1,2:Ri~??~?? TTI:0.0731846358804615 C1~-1,1,2:a Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:3.2727235499538e-19 C1~-1,1,2:bA D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e11 C1~-1,1,2:ba~??~?? TTI:-0.00248247078730975 C1~-1,1,2:be~??~?? TTI:-5.46414505266872e-05 C1~-1,1,2:bi~??~?? njjo:0.051172445470808 C1~-1,1,2:bu~??~?? TTI:-0.00159279963573215 ddh:8.10562606827128e-10 C1~-1,1,2:do~??~?? TTI:-0.000265091182345496 C1~-1,1,2:e TTI:-0.00277669394409798 C1~-1,1,2:ga TTI:-0.0145174035046015 njjo:-0.00664015139169255 C1~-1,1,2:gi njjo:-0.00411007289510381 C1~-1,1,2:go~??~?? TTI:-0.00224778953577349 C1~-1,1,2:ha~??~?? 
TTI:-0.000983122252755619 C1~-1,1,2:j MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 C1~-1,1,2:ka TTI:-0.00138153362588156 C1~-1,1,2:kh TTI:-3.2644782834012e-05 C1~-1,1,2:ki TTI:-0.000123415793751178 C1~-1,1,2:ko ddh:-2.83845604763285e-10 C1~-1,1,2:ku TTI:-0.00195366875203697 njjo:-0.0013669466430929 C1~-1,1,2:l MO:-2.41774429664504e-08 mO:-2.41774429664504e-08 mo:2.41774429664504e-08 C1~-1,1,2:la njjo:-0.0306792118542371 54
C1~-1,1,2:li MO:-8.61188658094846e-09 mO:-8.61188658094846e-09 mo:8.61188658094846e-09 C1~-1,1,2:l~??~?? TTI:-0.00418913059586931 C1~-1,1,2:mO TTI:-0.000655541358384558 C1~-1,1,2:ma MO:-2.44801288080114e-08 mO:-2.44801288080114e-08 mo:2.44801288080114e-08 C1~-1,1,2:mo TTI:-0.000644121009955766 C1~-1,1,2:mu njjo:-0.00125303442283516 C1~-1,1,2:n ddh:-3.80383923642545e-10 C1~-1,1,2:ngnga TTI:-0.00190506164886276 C1~-1,1,2:no~??~?? TTI:-0.00147468337913345 C1~-1,1,2:n~??~?? TTI:-0.000330257461449756 C1~-1,1,2:pa~??~?? TTI:-0.000842676216647677 C1~-1,1,2:p~??~?? TTI:-0.00136894394594989 C1~-1,1,2:r TTI:-0.000739228146821998 C1~-1,1,2:ri~??~?? TTI:-0.00703525043711547 C1~-1,1,2:ro~??~?? TTI:-0.00110088935224245 C1~-1,1,2:s TTI:-0.00217452842494425 C1~-1,1,2:sa njjo:-0.000148244086889984 C1~-1,1,2:va TTI:-0.000447012591601141 C1~-1,1,2:va~??~?? TTI:-0.000678453894341931 C1~-1,1,2:ya njjo:-0.00697478417695654 C1~-1,1,2:ya~??~?? TTI:-0.00341006431381021 C1~-1,1,2:yi TTI:-0.000147490387747709 C1~-1,1:Ba~?? TTI:-0.000706083090522109 C1~-1,1:Bi~?? TTI:-0.000847158458970901 C1~-1,1:M TTI:-0.00233978267051983 ddh:-1.46333064123832e-10 C1~-1,1:N TTI:-0.00287470534802889 C1~-1,1:N~?? TTI:-6.11241809240577e-05 C1~-1,1:Ri TTI:-0.00766143234527358 C1~-1,1:Ri~?? TTI:0.0731846358804615 C1~-1,1:a Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:-3.2727235499538e-19 C1~-1,1:bA D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e11 C1~-1,1:ba~?? TTI:-0.00248247078730975 C1~-1,1:be~?? TTI:-5.46414505266872e-05 C1~-1,1:bi~?? njjo:0.051172445470808 C1~-1,1:bu~?? TTI:-0.00159279963573215 ddh:8.10562606827128e-10 C1~-1,1:do~?? TTI:-0.000265091182345496 C1~-1,1:e TTI:-0.00277669394409798 C1~-1,1:ga TTI:-0.0145174035046015 njjo:-0.00664015139169255 C1~-1,1:gi njjo:-0.00411007289510381 C1~-1,1:go~?? TTI:-0.00224778953577349 C1~-1,1:ha~?? 
TTI:-0.000983122252755619 C1~-1,1:j MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 C1~-1,1:ka TTI:-0.00138153362588156 C1~-1,1:kh TTI:-3.2644782834012e-05 55
C1~-1,1:ki TTI:-0.000123415793751178 C1~-1,1:ko ddh:-2.83845604763285e-10 C1~-1,1:ku TTI:-0.00195366875203697 njjo:-0.0013669466430929 C1~-1,1:l MO:-2.41774429664504e-08 mO:-2.41774429664504e-08 mo:2.41774429664504e-08 C1~-1,1:la njjo:-0.0306792118542371 C1~-1,1:li MO:-8.61188658094846e-09 mO:-8.61188658094846e-09 mo:8.61188658094846e-09 C1~-1,1:l~?? TTI:-0.00418913059586931 C1~-1,1:mO TTI:-0.000655541358384558 C1~-1,1:ma MO:-2.44801288080114e-08 mO:-2.44801288080114e-08 mo:2.44801288080114e-08 C1~-1,1:mo TTI:-0.000644121009955766 C1~-1,1:mu njjo:-0.00125303442283516 C1~-1,1:n ddh:-3.80383923642545e-10 C1~-1,1:ngnga TTI:-0.00190506164886276 C1~-1,1:no~?? TTI:-0.00147468337913345 C1~-1,1:n~?? TTI:-0.000330257461449756 C1~-1,1:pa~?? TTI:-0.000842676216647677 C1~-1,1:p~?? TTI:-0.00136894394594989 C1~-1,1:r TTI:-0.000739228146821998 C1~-1,1:ri~?? TTI:-0.00703525043711547 C1~-1,1:ro~?? TTI:-0.00110088935224245 C1~-1,1:s TTI:-0.00217452842494425 C1~-1,1:sa njjo:-0.000148244086889984 C1~-1,1:va TTI:-0.000447012591601141 C1~-1,1:va~?? TTI:-0.000678453894341931 C1~-1,1:ya njjo:-0.00697478417695654 C1~-1,1:ya~?? 
TTI:-0.00341006431381021 C1~-1,1:yi TTI:-0.000147490387747709 C1~-1:Ba TTI:-0.000706083090522109 C1~-1:Bi TTI:-0.000847158458970901 C1~-1:M TTI:-0.00233978267051983 ddh:-1.46333064123832e-10 C1~-1:N TTI:-0.00293582952895295 C1~-1:Rai TTI:-0.00143474387129284 C1~-1:Ri TTI:0.0655232035351879 C1~-1:a Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:-3.2727235499538e-19 C1~-1:bA D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 C1~-1:ba TTI:-0.00248247078730975 C1~-1:be TTI:-5.46414505266872e-05 C1~-1:bi njjo:0.051172445470808 C1~-1:bu TTI:-0.00159279963573215 ddh:8.10562606827128e-10 C1~-1:do TTI:-0.000265091182345496 C1~-1:e TTI:-0.00277669394409798 C1~-1:ga TTI:-0.0145174035046015 njjo:-0.00664015139169255 C1~-1:gi njjo:-0.00411007289510381 C1~-1:go TTI:-0.00224778953577349 C1~-1:ha TTI:-0.000983122252755619 56
C1~-1:j MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 C1~-1:ka TTI:-0.00138153362588156 C1~-1:kh TTI:-3.2644782834012e-05 C1~-1:ki TTI:-0.000123415793751178 C1~-1:ko ddh:-2.83845604763285e-10 C1~-1:ku TTI:-0.00195366875203697 njjo:-0.0013669466430929 C1~-1:l MO:-2.41774429664504e-08 TTI:-0.00418913059586931 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 C1~-1:la njjo:-0.0306792118542371 C1~-1:li MO:-8.61188658094846e-09 mO:-8.61188658094846e-09 mo:8.61188658094846e-09 C1~-1:mO TTI:-0.000655541358384558 C1~-1:ma MO:-2.44801288080114e-08 mO:-2.44801288080114e-08 mo:2.44801288080114e-08 C1~-1:mo TTI:-0.000644121009955766 C1~-1:mu njjo:-0.00125303442283516 C1~-1:n TTI:-0.000330257461449756 ddh:-3.80383923642545e-10 C1~-1:ngnga TTI:-0.00190506164886276 C1~-1:no TTI:-0.00147468337913345 C1~-1:p TTI:-0.00136894394594989 C1~-1:pa TTI:-0.000842676216647677 C1~-1:r TTI:-0.000739228146821998 C1~-1:ri TTI:-0.00703525043711547 C1~-1:ro TTI:-0.00110088935224245 C1~-1:s TTI:-0.00217452842494425 C1~-1:sa njjo:-0.000148244086889984 C1~-1:she TTI:-0.00170499750240454 C1~-1:va TTI:-0.00112546648594307 C1~-1:ya TTI:-0.00341006431381021 njjo:-0.00697478417695654 C1~-1:yi TTI:-0.000147490387747709 C1~-2,-1,1:??~Ba~?? TTI:-0.000706083090522109 C1~-2,-1,1:??~Bi~?? TTI:-0.000847158458970901 C1~-2,-1,1:??~a Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:3.2727235499538e-19 C1~-2,-1,1:??~be~?? TTI:-5.46414505266872e-05 C1~-2,-1,1:??~bi~?? njjo:0.051172445470808 C1~-2,-1,1:??~bu~?? 
TTI:-0.00159279963573215 ddh:8.10562606827128e-10 C1~-2,-1,1:??~e TTI:-0.00277669394409798 C1~-2,-1,1:??~ga TTI:-0.0145174035046015 njjo:-0.000135085406972062 C1~-2,-1,1:??~gi njjo:-0.00411007289510381 C1~-2,-1,1:??~ku njjo:-0.0013669466430929 C1~-2,-1,1:??~la njjo:-0.0306792118542371 C1~-2,-1,1:??~mO TTI:-0.000655541358384558 C1~-2,-1,1:??~mo TTI:-0.000644121009955766 C1~-2,-1,1:??~mu njjo:-0.00125303442283516 C1~-2,-1,1:Ba~do~?? TTI:-0.000265091182345496 57
C1~-2,-1,1:a D:-1.86970029414429e-11 MO:-5.72694583554103e-08 T:8.87726942275319e-13 TTI:-0.000739228146821998 d:-1.86970029414429e-11 ddh:5.59793819268256e-10 mO:-5.72694583554103e-08 mo:-5.72694583554103e-08 C1~-2,-1,1:be~l~?? TTI:-0.00405809811408058 C1~-2,-1,1:bi~l~?? TTI:-0.000131032481788732 C1~-2,-1,1:bu~ri~?? TTI:-0.00703525043711547 C1~-2,-1,1:b~Ri~?? TTI:0.0731846358804615 C1~-2,-1,1:di TTI:-0.00190506164886276 C1~-2,-1,1:e TTI:-0.000123415793751178 C1~-2,-1,1:ga TTI:-0.00504923377297314 C1~-2,-1,1:g~va~?? TTI:-0.000678453894341931 C1~-2,-1,1:ke TTI:-0.000447012591601141 C1~-2,-1,1:ku~n~?? TTI:-0.000330257461449756 C1~-2,-1,1:l TTI:-0.00119232453714038 C1~-2,-1,1:m TTI:-0.00766143234527358 C1~-2,-1,1:ma njjo:-0.000148244086889984 C1~-2,-1,1:mu TTI:-3.2644782834012e-05 C1~-2,-1,1:ni njjo:-0.00697478417695654 C1~-2,-1,1:ni~pa~?? TTI:-0.000842676216647677 C1~-2,-1,1:no TTI:-0.000147490387747709 C1~-2,-1,1:r TTI:-0.000389454253998589 njjo:-0.00650506598472048 C1~-2,-1,1:ra MO:-4.75586896143878e-08 TTI:-0.00409320625729938 mO:4.75586896143878e-08 mo:-4.75586896143878e-08 C1~-2,-1,1:ri~ya~?? 
TTI:-0.00341006431381021 C1~-2,-1:??~Ba TTI:-0.000706083090522109 C1~-2,-1:??~Bi TTI:-0.000847158458970901 C1~-2,-1:??~a Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:3.2727235499538e-19 C1~-2,-1:??~be TTI:-5.46414505266872e-05 C1~-2,-1:??~bi njjo:0.051172445470808 C1~-2,-1:??~bu TTI:-0.00159279963573215 ddh:8.10562606827128e-10 C1~-2,-1:??~e TTI:-0.00277669394409798 C1~-2,-1:??~ga TTI:-0.0145174035046015 njjo:-0.000135085406972062 C1~-2,-1:??~gi njjo:-0.00411007289510381 C1~-2,-1:??~ku njjo:-0.0013669466430929 C1~-2,-1:??~la njjo:-0.0306792118542371 C1~-2,-1:??~mO TTI:-0.000655541358384558 C1~-2,-1:??~mo TTI:-0.000644121009955766 C1~-2,-1:??~mu njjo:-0.00125303442283516 C1~-2,-1:Ba~do TTI:-0.000265091182345496 C1~-2,-1:a D:-1.86970029414429e-11 MO:-5.72694583554103e-08 T:-8.87726942275319e13 TTI:-0.000739228146821998 d:-1.86970029414429e-11 ddh:-5.59793819268256e-10 mO:-5.72694583554103e-08 mo:-5.72694583554103e-08 C1~-2,-1:be~l TTI:-0.00405809811408058 C1~-2,-1:bi~l TTI:-0.000131032481788732 C1~-2,-1:bu~ri TTI:-0.00703525043711547 C1~-2,-1:b~Ri TTI:0.0731846358804615 C1~-2,-1:di TTI:-0.00190506164886276 58
C1~-2,-1:e TTI:-0.000123415793751178 C1~-2,-1:ga TTI:-0.00504923377297314 C1~-2,-1:g~va TTI:-0.000678453894341931 C1~-2,-1:ke TTI:-0.000447012591601141 C1~-2,-1:ku~n TTI:-0.000330257461449756 C1~-2,-1:l TTI:-0.00119232453714038 C1~-2,-1:m TTI:-0.00766143234527358 C1~-2,-1:ma njjo:-0.000148244086889984 C1~-2,-1:mu TTI:-3.2644782834012e-05 C1~-2,-1:ni njjo:-0.00697478417695654 C1~-2,-1:ni~pa TTI:-0.000842676216647677 C1~-2,-1:no TTI:-0.000147490387747709 C1~-2,-1:r TTI:-0.000389454253998589 njjo:-0.00650506598472048 C1~-2,-1:ra MO:-4.75586896143878e-08 TTI:-0.00409320625729938 mO:4.75586896143878e-08 mo:-4.75586896143878e-08 C1~-2,-1:ri~ya TTI:-0.00341006431381021 C1~-2:?? Ba:-3.67737675462012e-11 TTI:-0.0217944424527916 bA:-3.67737675462012e11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:8.10562606827128e-10 njjo:0.013628094248567 C1~-2:Ba TTI:-0.00388182466406888 C1~-2:Bau TTI:-6.11241809240577e-05 C1~-2:a D:-1.86970029414429e-11 MO:-5.72694583554103e-08 T:-8.87726942275319e-13 TTI:-0.000739228146821998 d:-1.86970029414429e-11 ddh:-5.59793819268256e-10 mO:5.72694583554103e-08 mo:-5.72694583554103e-08 C1~-2:b TTI:0.0731846358804615 C1~-2:be TTI:-0.00405809811408058 C1~-2:bi TTI:-0.00156577635308157 C1~-2:bo TTI:-0.00110088935224245 C1~-2:bu TTI:-0.00703525043711547 C1~-2:di TTI:-0.00190506164886276 C1~-2:e TTI:-0.000123415793751178 C1~-2:g TTI:-0.000678453894341931 C1~-2:ga TTI:-0.00527182651168189 C1~-2:ke TTI:-0.000447012591601141 C1~-2:ku TTI:-0.000330257461449756 C1~-2:l TTI:-0.00345220258574139 C1~-2:m TTI:-0.00766143234527358 C1~-2:ma njjo:-0.000148244086889984 C1~-2:mu TTI:-3.2644782834012e-05 C1~-2:ngng TTI:-0.000983122252755619 C1~-2:ni TTI:-0.000842676216647677 njjo:-0.00697478417695654 C1~-2:no TTI:-0.000147490387747709 C1~-2:r TTI:-0.00209445175640313 njjo:-0.00650506598472048 C1~-2:ra MO:-4.75586896143878e-08 TTI:-0.00409320625729938 mO:4.75586896143878e-08 mo:-4.75586896143878e-08 C1~-2:ri TTI:-0.00341006431381021 
C1~-2:th TTI:-0.00147468337913345 C1~1,2:??~?? Ba:-3.67737675462012e-11 D:-1.86970029414429e-11 MO:1.04828147969798e-07 T:-8.87726942275319e-13 TTI:-3.17806761809813e-18 bA:59
3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:3.67737675462012e-11 d:-1.86970029414429e-11 ddh:1.4172952320153e-17 mO:1.04828147969798e-07 mo:-1.04828147969798e-07 njjo:-8.40256683676266e-19 Swn:. Ba:-3.67737675462012e-11 D:-1.86970029414429e-11 MO:-1.04828147969798e-07 T:-8.87726942275319e-13 TTI:-3.17806761809813e-18 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 d:1.86970029414429e-11 ddh:1.4172952320153e-17 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 njjo:-8.40256683676266e-19 k0:Ba Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 k0:D~D D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 k0:MO MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:-1.04828147969798e07 k0:TTI~TTi TTI:-3.17806761809813e-18 k0:ddh~dhA~dhA ddh:1.4172952320153e-17 k0:njco njjo:-8.40256683676266e-19 k1:.~. D:-1.86970029414429e-11 T:-8.87726942275319e-13 TTI:-0.00744129542224498 d:1.86970029414429e-11 ddh:-2.83845604763285e-10 njjo:-0.00650506598472048 k1:A TTI:-0.00156918911120731 k1:BU~Bu~Bu TTI:-0.00341006431381021 k1:Ba TTI:-0.000123415793751178 ddh:-3.05524784074626e-18 k1:D~D Ba:-3.67737675462012e-11 MO:-8.61188658094846e-09 bA:-3.67737675462012e11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 k1:LA ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 k1:Lai TTI:-0.00766143234527358 k1:Li~Li njjo:-0.0013669466430929 k1:Mg~Mg TTI:-0.000111604940724642 k1:M~M TTI:-0.002259878048601 k1:NA TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 k1:N~N TTI:-0.00128380525685815 k1:R TTI:-0.00190506164886276 ddh:-2.50768775655121e-10 njjo:0.000148244086889984 k1:RA MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:-4.86575717744618e08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 k1:RI ddh:-1.29615147987424e-10 k1:S 
TTI:-0.00920977886205972 k1:S~S TTI:0.0731846358804615 k1:THA njjo:-0.00411007289510381 k1:TTI~TTi MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 njjo:-0.000135085406972062 k1:^ ddh:-3.2727235499538e-19 njjo:-0.00697478417695654 k1:bU ddh:-5.46161792072088e-10 k1:ddh~dhA~dhA TTI:-7.09462378154227e-05 k1:g TTI:-5.46414505266872e-05 k1:gA TTI:-0.0017897139889239 ddh:2.64794548787695e-09 k1:gha TTI:-0.00143474387129284 ddh:-7.08505145415428e-10 k1:jA TTI:-0.000147490387747709 k1:k TTI:-0.000447012591601141 60
k1:kO~kO TTI:-0.00695106166046719 k1:khE ddh:-5.82715939834801e-10 k1:k~k TTI:-0.00405809811408058 k1:mA TTI:-0.00119390023876806 k1:pU TTI:-0.00247038227448224 k1:ppA TTI:-0.00175342358677956 k1:vA~vA TTI:-7.76041480165313e-05 k1:yA TTI:-0.000902905411444741 k2:.~. Ba:-3.67737675462012e-11 MO:-8.06507050033476e-08 TTI:-0.00774360751972067 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:3.67737675462012e-11 ddh:-3.97101837385238e-10 mO:-8.06507050033476e-08 mo:8.06507050033476e-08 njjo:-0.0335825224140272 k2:?? D:-1.86970029414429e-11 T:-8.87726942275319e-13 TTI:-0.00744129542224498 d:1.86970029414429e-11 ddh:-2.83845604763285e-10 njjo:-0.00650506598472048 k2:Ba TTI:-3.2644782834012e-05 k2:DA TTI:-3.77474009231699e-05 k2:Da MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:-2.41774429664504e08 mo:-2.41774429664504e-08 k2:I njjo:0.051172445470808 k2:LA TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 k2:Li~Li TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 k2:Mg~Mg TTI:-0.00119232453714038 k2:NA TTI:-0.00143474387129284 k2:N~N TTI:-0.00104440204338466 k2:R TTI:-0.00541603386032655 ddh:-3.05524784074626e-18 k2:RA TTI:-0.00129966236834032 ddh:-5.82715940162074e-10 k2:RI TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:-0.00697478417695654 k2:Ra ddh:-5.46161792072088e-10 k2:TA TTI:-0.00703525043711547 k2:^ TTI:-0.000123415793751178 k2:hI TTI:-0.00136894394594989 k2:k njjo:-0.00411007289510381 k2:kA TTI:-0.00264428979568476 ddh:-2.39371510894065e-18 k2:kO TTI:0.0731846358804615 k2:kO~kO TTI:-0.0145174035046015 k2:k~k TTI:-0.00233978267051983 k2:pA~pA TTI:-0.000398899396964083 k2:ppA TTI:-0.00277669394409798 k2:s TTI:-0.000200245165257415 k2:vA~vA TTI:-0.00217452842494425 k2:yA TTI:-0.00766143234527358 ddh:2.64794548787695e-09 m0~Ba Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 m0~D D:-1.86970029414429e-11 T:-8.87726942275319e-13 
d:-1.86970029414429e-11 m0~D:1 D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 m0~MO MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 m0~T D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 m0~TTI:1 TTI:-3.17806761809813e-18 61
m0~TTi TTI:-3.17806761809813e-18 m0~Ti TTI:-3.17806761809813e-18 m0~Ti:1 TTI:-3.17806761809813e-18 m0~bA Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 m0~bA:1 Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 m0~bE Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 m0~ba Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 m0~ba:1 Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 m0~be Ba:-3.67737675462012e-11 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 m0~d D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 m0~d:1 D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 m0~ddh:1 ddh:1.4172952320153e-17 m0~dhA ddh:1.4172952320153e-17 m0~dhA:1 ddh:1.4172952320153e-17 m0~dha ddh:1.4172952320153e-17 m0~dha:1 ddh:1.4172952320153e-17 m0~mO MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 m0~mO:1 MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 m0~mU MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 m0~ma MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:-1.04828147969798e07 m0~mo MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 m0~mo:1 MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 m0~njco njjo:-8.40256683676266e-19 m0~njjo:1 njjo:-8.40256683676266e-19 m0~njnja njjo:-8.40256683676266e-19 m0~njnjo njjo:-8.40256683676266e-19 m0~th TTI:-3.17806761809813e-18 m0~thI TTI:-3.17806761809813e-18 m0~thi TTI:-3.17806761809813e-18 m0~thi:1 TTI:-3.17806761809813e-18 m0~ththi TTI:-3.17806761809813e-18 
m0~ti TTI:-3.17806761809813e-18 m0~ti:1 TTI:-3.17806761809813e-18 m1~. D:-1.86970029414429e-11 T:-8.87726942275319e-13 TTI:-0.00744129542224498 d:1.86970029414429e-11 ddh:-2.83845604763285e-10 njjo:-0.00650506598472048 m1~.:1 D:-1.86970029414429e-11 T:-8.87726942275319e-13 TTI:-0.00744129542224498 d:-1.86970029414429e-11 ddh:-2.83845604763285e-10 njjo:-0.00650506598472048 62
m1~A TTI:-0.00156918911120731 m1~BU:1 TTI:-0.00341006431381021 m1~Ba TTI:-0.000123415793751178 ddh:-3.05524784074626e-18 m1~Bu TTI:-0.00341006431381021 m1~Bu:1 TTI:-0.00341006431381021 m1~D Ba:-3.67737675462012e-11 MO:-8.61188658094846e-09 bA:-3.67737675462012e11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 m1~D:1 Ba:-3.67737675462012e-11 MO:-8.61188658094846e-09 bA:-3.67737675462012e11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 m1~LA ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 m1~La ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 m1~La:1 ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 m1~Lai TTI:-0.00766143234527358 m1~Li njjo:-0.0013669466430929 m1~Li:1 njjo:-0.0013669466430929 m1~M TTI:-0.002259878048601 m1~M:1 TTI:-0.002259878048601 m1~Mg TTI:-0.000111604940724642 m1~Mg:1 TTI:-0.000111604940724642 m1~N TTI:-0.00128380525685815 m1~N:1 TTI:-0.00128380525685815 m1~NA TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 m1~Na TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 m1~Na:1 TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 m1~R TTI:-0.00190506164886276 ddh:-2.50768775655121e-10 njjo:0.000148244086889984 m1~RA MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 m1~RE MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 m1~RI ddh:-1.29615147987424e-10 m1~Ra MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:-4.86575717744618e08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 m1~Ri ddh:-1.29615147987424e-10 m1~Ri:1 ddh:-1.29615147987424e-10 m1~S TTI:0.0639748570184017 m1~S:1 TTI:0.0731846358804615 m1~T Ba:-3.67737675462012e-11 MO:-8.61188658094846e-09 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 
be:-3.67737675462012e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 m1~THA njjo:-0.00411007289510381 m1~THa njjo:-0.00411007289510381 m1~TTI:1 MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 njjo:-0.000135085406972062 m1~TTi MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 njjo:-0.000135085406972062 m1~Ta njjo:-0.00411007289510381 63
m1~ThA njjo:-0.00411007289510381 m1~Tha njjo:-0.00411007289510381 m1~Ti MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:-4.75586896143878e08 njjo:-0.000135085406972062 m1~Ti:1 MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 njjo:-0.000135085406972062 m1~^ TTI:-0.010778967973267 ddh:-3.2727235499538e-19 njjo:-0.00697478417695654 m1~a TTI:-0.00156918911120731 m1~bA TTI:-0.000123415793751178 ddh:-3.05524784074626e-18 m1~bA:1 TTI:-0.000123415793751178 ddh:-3.05524784074626e-18 m1~bE TTI:-0.000123415793751178 ddh:-3.05524784074626e-18 m1~bU ddh:-5.46161792072088e-10 m1~ba TTI:-0.000123415793751178 ddh:-5.46161795127336e-10 m1~ba:1 TTI:-0.000123415793751178 ddh:-5.46161795127336e-10 m1~be TTI:-0.000123415793751178 ddh:-3.05524784074626e-18 m1~bu ddh:-5.46161792072088e-10 m1~bu:1 ddh:-5.46161792072088e-10 m1~byu ddh:-5.46161792072088e-10 m1~byu:1 ddh:-5.46161792072088e-10 m1~d Ba:-3.67737675462012e-11 MO:-8.61188658094846e-09 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 m1~d:1 Ba:-3.67737675462012e-11 MO:-8.61188658094846e-09 bA:-3.67737675462012e11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 m1~da njjo:-0.00411007289510381 m1~ddh:1 TTI:-7.09462378154227e-05 m1~dhA TTI:-7.09462378154227e-05 m1~dhA:1 TTI:-7.09462378154227e-05 m1~dha TTI:-7.09462378154227e-05 m1~dha:1 TTI:-7.09462378154227e-05 m1~g TTI:-5.46414505266872e-05 m1~gA TTI:-0.0017897139889239 ddh:2.64794548787695e-09 m1~gE TTI:-0.00184435543945059 ddh:2.64794548787695e-09 m1~gE:1 TTI:-5.46414505266872e-05 m1~ga TTI:-0.00184435543945059 ddh:2.64794548787695e-09 m1~ga:1 TTI:-0.0017897139889239 ddh:2.64794548787695e-09 m1~ge TTI:-5.46414505266872e-05 m1~ge:1 TTI:-5.46414505266872e-05 m1~gha TTI:-0.00143474387129284 ddh:-7.08505145415428e-10 m1~h njjo:-0.00697478417695654 m1~h:1 njjo:-0.00697478417695654 
m1~j TTI:-5.46414505266872e-05 m1~jA TTI:-0.000147490387747709 m1~jE TTI:-0.000147490387747709 m1~ja TTI:-0.000147490387747709 m1~ja:1 TTI:-0.000147490387747709 m1~k TTI:-0.00450511070568172 m1~k:1 TTI:-0.00405809811408058 64
m1~kO TTI:-0.00695106166046719 m1~kO:1 TTI:-0.00695106166046719 m1~kU TTI:-0.000447012591601141 m1~ka TTI:-0.0114561723661489 m1~khA TTI:-0.00143474387129284 ddh:-7.08505145415428e-10 m1~khE TTI:-0.00143474387129284 ddh:-1.29122108525023e-09 m1~kha TTI:-0.00143474387129284 ddh:-1.29122108525023e-09 m1~kha:1 TTI:-0.00143474387129284 ddh:-7.08505145415428e-10 m1~khe ddh:-5.82715939834801e-10 m1~khe:1 ddh:-5.82715939834801e-10 m1~kk TTI:-0.00405809811408058 m1~kkO TTI:-0.00695106166046719 m1~kku TTI:-0.000447012591601141 m1~ko TTI:-0.00695106166046719 m1~ko:1 TTI:-0.00695106166046719 m1~ku TTI:-0.000447012591601141 m1~ku:1 TTI:-0.000447012591601141 m1~lA ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 m1~lE ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 m1~lI njjo:-0.0013669466430929 m1~la ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 m1~la:1 ddh:-1.46333064123832e-10 njjo:-0.00125303442283516 m1~lai TTI:-0.00766143234527358 njjo:-0.0013669466430929 m1~lai:1 TTI:-0.00766143234527358 m1~li njjo:-0.0013669466430929 m1~li:1 njjo:-0.0013669466430929 m1~m TTI:-0.002259878048601 m1~m:1 TTI:-0.002259878048601 m1~mA TTI:-0.00119390023876806 m1~mE TTI:-0.00119390023876806 m1~ma TTI:-0.00345377828736907 m1~ma:1 TTI:-0.00119390023876806 m1~n TTI:-0.00128380525685815 m1~n:1 TTI:-0.00128380525685815 m1~nA TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 m1~nA:1 TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 m1~na TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 m1~na:1 TTI:-0.0145174035046015 ddh:-3.72750083329056e-18 m1~ng TTI:-0.0013954101975828 m1~ng:1 TTI:-0.00128380525685815 m1~ngng TTI:-0.000111604940724642 m1~ngng:1 TTI:-0.000111604940724642 m1~nj TTI:-0.00128380525685815 m1~pU TTI:-0.00247038227448224 m1~ppA TTI:-0.00175342358677956 m1~ppU TTI:-0.00247038227448224 m1~ppa TTI:-0.00175342358677956 m1~ppa:1 TTI:-0.00175342358677956 m1~pu TTI:-0.00247038227448224 65
m1~pu:1 TTI:-0.00247038227448224 m1~r TTI:-0.00190506164886276 ddh:-2.50768775655121e-10 njjo:0.000148244086889984 m1~r:1 TTI:-0.00190506164886276 ddh:-2.50768775655121e-10 njjo:0.000148244086889984 m1~rA MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:-4.86575717744618e08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 m1~rA:1 MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 m1~rE MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:-4.86575717744618e08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 m1~ra MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:-4.86575717744618e08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 m1~ra:1 MO:-4.86575717744618e-08 TTI:-0.00233978267051983 mO:4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 m1~ri ddh:-1.29615147987424e-10 m1~ri:1 ddh:-1.29615147987424e-10 m1~s TTI:-0.00920977886205972 m1~s:1 TTI:-0.00920977886205972 m1~sh TTI:0.0639748570184017 m1~sh:1 TTI:0.0731846358804615 m1~th MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:-4.75586896143878e08 njjo:-0.000135085406972062 m1~thA njjo:-0.00411007289510381 m1~thI MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:-4.75586896143878e08 njjo:-0.000135085406972062 m1~tha njjo:-0.00411007289510381 m1~tha:1 njjo:-0.00411007289510381 m1~thi MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:-4.75586896143878e08 njjo:-0.000135085406972062 m1~thi:1 MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 njjo:-0.000135085406972062 m1~ththa njjo:-0.00411007289510381 m1~ththi MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 njjo:-0.000135085406972062 m1~ti MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:-4.75586896143878e08 njjo:-0.000135085406972062 m1~ti:1 MO:-4.75586896143878e-08 mO:-4.75586896143878e-08 mo:4.75586896143878e-08 njjo:-0.000135085406972062 m1~u ddh:-3.2727235499538e-19 m1~u:1 ddh:-3.2727235499538e-19 m1~vA 
TTI:-7.76041480165313e-05 m1~vA:1 TTI:-7.76041480165313e-05 m1~vE TTI:-7.76041480165313e-05 m1~va TTI:-0.00164679325922384 m1~va:1 TTI:-7.76041480165313e-05 m1~vaa TTI:-7.76041480165313e-05 m1~vva TTI:-7.76041480165313e-05 m1~y TTI:-0.00156918911120731 66
m1~yA TTI:-0.00247209452265205 m1~yA:1 TTI:-0.00156918911120731 m1~yU ddh:-3.2727235499538e-19 m1~ya TTI:-0.00243944973981803 m1~ya:1 TTI:-0.00243944973981803 m1~yu ddh:-3.2727235499538e-19 m1~yu:1 ddh:-3.2727235499538e-19 m1~yyA TTI:-0.000870260628610729 m2~. Ba:-3.67737675462012e-11 MO:-8.06507050033476e-08 TTI:-0.00774360751972067 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:3.67737675462012e-11 ddh:-3.97101837385238e-10 mO:-8.06507050033476e-08 mo:8.06507050033476e-08 njjo:-0.0335825224140272 m2~.:1 Ba:-3.67737675462012e-11 MO:-8.06507050033476e-08 TTI:0.00774360751972067 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:3.67737675462012e-11 be:-3.67737675462012e-11 ddh:-3.97101837385238e-10 mO:8.06507050033476e-08 mo:-8.06507050033476e-08 njjo:-0.0335825224140272 m2~??:1 D:-1.86970029414429e-11 T:-8.87726942275319e-13 TTI:-0.00744129542224498 d:-1.86970029414429e-11 ddh:-2.83845604763285e-10 njjo:-0.00650506598472048 m2~Ba TTI:-3.2644782834012e-05 m2~DA TTI:-3.77474009231699e-05 m2~Da MO:-2.41774429664504e-08 TTI:-0.000884905859894071 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 m2~Da:1 TTI:-3.77474009231699e-05 m2~Di MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 m2~Di:1 MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 m2~I njjo:0.051172445470808 m2~LA TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 m2~La TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 m2~La:1 TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 m2~Li TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 m2~Li:1 TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 m2~Mg TTI:-0.00119232453714038 m2~Mg:1 TTI:-0.00119232453714038 m2~N TTI:-0.00104440204338466 m2~N:1 TTI:-0.00104440204338466 m2~NA TTI:-0.00143474387129284 m2~Na TTI:-0.00143474387129284 m2~Na:1 TTI:-0.00143474387129284 m2~R TTI:-0.00541603386032655 ddh:-3.05524784074626e-18 m2~RA 
TTI:-0.00129966236834032 ddh:-5.82715940162074e-10 m2~RE TTI:-0.00129966236834032 ddh:-5.82715940162074e-10 m2~RI TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:0.00697478417695654 m2~Ra TTI:-0.00129966236834032 ddh:-1.12887773223416e-09 m2~Ri TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:0.00697478417695654 67
m2~Ri:1 TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:0.00697478417695654 m2~Ru ddh:-5.46161792072088e-10 m2~Ru:1 ddh:-5.46161792072088e-10 m2~TA TTI:-0.00703525043711547 m2~TTA TTI:-0.00703525043711547 m2~TTa TTI:-0.00703525043711547 m2~TTa:1 TTI:-0.00703525043711547 m2~Ta TTI:-0.00707299783803864 m2~Tha TTI:-0.00703525043711547 m2~Ti MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 m2~^ TTI:-0.000123415793751178 m2~ai TTI:-0.000123415793751178 njjo:0.051172445470808 m2~ai:1 njjo:0.051172445470808 m2~bA TTI:-3.2644782834012e-05 m2~bA:1 TTI:-3.2644782834012e-05 m2~bE TTI:-3.2644782834012e-05 m2~ba TTI:-3.2644782834012e-05 m2~ba:1 TTI:-3.2644782834012e-05 m2~be TTI:-3.2644782834012e-05 m2~dA TTI:-3.77474009231699e-05 m2~dI MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 m2~da TTI:-3.77474009231699e-05 m2~da:1 TTI:-3.77474009231699e-05 m2~di MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 m2~di:1 MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 m2~hI TTI:-0.00136894394594989 m2~hai TTI:-0.00136894394594989 m2~hi TTI:-0.00136894394594989 m2~hi:1 TTI:-0.00136894394594989 m2~i TTI:-0.000123415793751178 njjo:0.051172445470808 m2~i:1 TTI:-0.000123415793751178 njjo:0.051172445470808 m2~k TTI:-0.00233978267051983 njjo:-0.00411007289510381 m2~k:1 TTI:-0.00233978267051983 m2~kA TTI:-0.00264428979568476 ddh:-2.39371510894065e-18 m2~kO TTI:0.05866723237586 m2~kO:1 TTI:-0.0145174035046015 m2~kU njjo:-0.00411007289510381 m2~ka TTI:0.0536831599096554 ddh:-2.39371510894065e-18 njjo:-0.00411007289510381 m2~ka:1 TTI:-0.00264428979568476 ddh:-2.39371510894065e-18 m2~kk TTI:-0.00233978267051983 m2~kkO TTI:0.05866723237586 m2~kka TTI:-0.00264428979568476 ddh:-2.39371510894065e-18 m2~kku njjo:-0.00411007289510381 m2~ko TTI:0.05866723237586 68
m2~ko:1 TTI:0.05866723237586 m2~ku TTI:0.0731846358804615 njjo:-0.00411007289510381 m2~ku:1 njjo:-0.00411007289510381 m2~lA TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 m2~lE TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 m2~lI TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 m2~la TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 m2~la:1 TTI:-0.000147490387747709 ddh:-3.72750083329056e-18 m2~lai TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 m2~li TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 m2~li:1 TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 m2~n TTI:-0.00104440204338466 m2~n:1 TTI:-0.00104440204338466 m2~nA TTI:-0.00143474387129284 m2~nA:1 TTI:-0.00143474387129284 m2~na TTI:-0.00143474387129284 m2~na:1 TTI:-0.00143474387129284 m2~ng TTI:-0.00223672658052504 m2~ng:1 TTI:-0.00104440204338466 m2~ngng TTI:-0.00119232453714038 m2~ngng:1 TTI:-0.00119232453714038 m2~nj TTI:-0.00104440204338466 m2~pA TTI:-0.000398899396964083 m2~pA:1 TTI:-0.000398899396964083 m2~pa TTI:-0.000398899396964083 m2~pa:1 TTI:-0.000398899396964083 m2~ppA TTI:-0.00317559334106206 m2~ppa TTI:-0.00277669394409798 m2~ppa:1 TTI:-0.00277669394409798 m2~r TTI:-0.00541603386032655 ddh:-5.46161795127336e-10 m2~r:1 TTI:-0.00541603386032655 ddh:-3.05524784074626e-18 m2~rA TTI:-0.00129966236834032 ddh:-5.82715940162074e-10 m2~rA:1 TTI:-0.00129966236834032 ddh:-5.82715940162074e-10 m2~rE TTI:-0.00129966236834032 ddh:-5.82715940162074e-10 m2~rU ddh:-5.46161792072088e-10 m2~ra TTI:-0.00129966236834032 ddh:-1.12887773223416e-09 m2~ra:1 TTI:-0.00129966236834032 ddh:-1.12887773223416e-09 m2~ri TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:0.00697478417695654 m2~ri:1 TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:0.00697478417695654 m2~ru ddh:-5.46161792072088e-10 m2~ru:1 ddh:-5.46161792072088e-10 m2~s TTI:-0.000200245165257415 m2~sU TTI:-0.000200245165257415 m2~sa TTI:-0.000200245165257415 m2~su TTI:-0.000200245165257415 m2~su:1 TTI:-0.000200245165257415 m2~tA 
TTI:-0.00703525043711547 
m2~tE TTI:-0.00703525043711547 m2~ta TTI:-0.00703525043711547 m2~ta:1 TTI:-0.00703525043711547 m2~th MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 m2~thA TTI:-0.00703525043711547 m2~tha TTI:-0.00703525043711547 m2~tha:1 TTI:-0.00703525043711547 m2~ththA TTI:-0.00703525043711547 m2~vA TTI:-0.00217452842494425 m2~vA:1 TTI:-0.00217452842494425 m2~va TTI:-0.00217452842494425 m2~va:1 TTI:-0.00217452842494425 m2~vi njjo:0.051172445470808 m2~y TTI:-0.000123415793751178 njjo:0.051172445470808 m2~y:1 TTI:-0.000123415793751178 m2~yA TTI:-0.00766143234527358 ddh:2.64794548787695e-09 m2~ya TTI:-0.00766143234527358 ddh:2.64794548787695e-09 njjo:0.051172445470808 m2~ya:1 TTI:-0.00766143234527358 ddh:2.64794548787695e-09 m2~yi njjo:0.051172445470808 m2~yi:1 njjo:0.051172445470808 m2~yyA TTI:-0.00766143234527358 ddh:2.64794548787695e-09 ~?? Ba:-3.67737675462012e-11 D:-1.86970029414429e-11 MO:-1.04828147969798e-07 T:8.87726942275319e-13 TTI:-0.0403742643353437 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 d:1.86970029414429e-11 ddh:-8.10562592856934e-10 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 njjo:-0.051172445470808 ~??~?? 
Ba:-3.67737675462012e-11 D:-1.86970029414429e-11 MO:-1.04828147969798e-07 T:-8.87726942275319e-13 TTI:-0.0403742643353437 bA:-3.67737675462012e-11 bE:3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 d:1.86970029414429e-11 ddh:-8.10562592856934e-10 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 njjo:-0.051172445470808 ~Da~Da TTI:-3.77474009231699e-05 ~Di~Di MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 ~La~La TTI:-0.000147490387747709 ddh:-1.46333067851333e-10 njjo:0.00125303442283516 ~M TTI:-0.00233978267051983 ddh:-1.46333064123832e-10 ~N TTI:-0.00287470534802889 ~Na~Na TTI:-0.0159521473758943 ddh:-3.72750083329056e-18 ~RE MO:-4.86575717744618e-08 TTI:-0.00363944503886015 ddh:-5.82715940162074e-10 mO:-4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 ~Ra MO:-4.86575717744618e-08 TTI:-0.00363944503886015 ddh:-5.82715940162074e-10 mO:-4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 ~Ri TTI:-0.00766143234527358 ~Ri~Ri TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:0.00697478417695654 ~Ru~Ru ddh:-5.46161792072088e-10 70
~T Ba:-3.67737675462012e-11 D:-1.86970029414429e-11 MO:-8.61188658094846e-09 T:8.87726942275319e-13 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:3.67737675462012e-11 be:-3.67737675462012e-11 d:-1.86970029414429e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 ~THa njjo:-0.00411007289510381 ~TTA TTI:-0.00703525043711547 ~TTa~TTa TTI:-0.00703525043711547 ~Ta TTI:-0.00707299783803864 njjo:-0.00411007289510381 ~ThA njjo:-0.00411007289510381 ~Tha TTI:-0.00703525043711547 njjo:-0.00411007289510381 ~Ti MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:-2.41774429664504e-08 mo:-2.41774429664504e-08 ~Ti~Ti MO:-4.75586896143878e-08 TTI:-3.17806761809813e-18 mO:4.75586896143878e-08 mo:-4.75586896143878e-08 njjo:-0.000135085406972062 ~^ TTI:-0.010778967973267 ~a TTI:-0.00156918911120731 ~ai TTI:-0.000123415793751178 ~ai~ai njjo:0.051172445470808 ~bA D:-1.86970029414429e-11 T:-8.87726942275319e-13 d:-1.86970029414429e-11 ~bA~bA Ba:-3.67737675462012e-11 TTI:-0.00015606057658519 bA:-3.67737675462012e11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:3.05524784074626e-18 ~bE Ba:-3.67737675462012e-11 TTI:-0.00015606057658519 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:3.05524784074626e-18 ~ba~ba Ba:-3.67737675462012e-11 TTI:-0.00015606057658519 bA:-3.67737675462012e11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:5.46161795127336e-10 ~be Ba:-3.67737675462012e-11 TTI:-0.00015606057658519 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:-3.67737675462012e-11 be:-3.67737675462012e-11 ddh:3.05524784074626e-18 ~bu~bu ddh:-5.46161792072088e-10 ~byu~byu ddh:-5.46161792072088e-10 ~dA TTI:-3.77474009231699e-05 ~dI MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:-2.41774429664504e-08 mo:-2.41774429664504e-08 ~da njjo:-0.00411007289510381 ~da~da TTI:-3.77474009231699e-05 ~dha~dha TTI:-7.09462378154227e-05 ddh:1.4172952320153e-17 ~di~di 
MO:-2.41774429664504e-08 TTI:-0.000847158458970901 mO:2.41774429664504e-08 mo:-2.41774429664504e-08 ~d~d Ba:-3.67737675462012e-11 D:-1.86970029414429e-11 MO:-8.61188658094846e-09 T:-8.87726942275319e-13 bA:-3.67737675462012e-11 bE:-3.67737675462012e-11 ba:3.67737675462012e-11 be:-3.67737675462012e-11 d:-1.86970029414429e-11 mO:8.61188658094846e-09 mo:-8.61188658094846e-09 ~gE TTI:-0.0017897139889239 ddh:2.64794548787695e-09 ~gE~gE TTI:-5.46414505266872e-05 ~ga TTI:-5.46414505266872e-05 njjo:-0.00650506598472048 ~ga~ga TTI:-0.0017897139889239 ddh:2.64794548787695e-09 71
~ge~ge TTI:-5.46414505266872e-05 ~hai TTI:-0.00136894394594989 ~hi~hi TTI:-0.00136894394594989 ~h~h njjo:-0.00697478417695654 ~i~i TTI:-0.000123415793751178 njjo:0.051172445470808 ~j MO:-4.75586896143878e-08 TTI:-5.46414505266872e-05 mO:-4.75586896143878e-08 mo:-4.75586896143878e-08 ~jE TTI:-0.000147490387747709 ~ja~ja TTI:-0.000147490387747709 ~kU TTI:-0.000447012591601141 njjo:-0.00411007289510381 ~ka TTI:0.0434897437133097 njjo:-0.00411007289510381 ~ka~ka TTI:-0.00264428979568476 ddh:-2.39371510894065e-18 ~kh TTI:-3.2644782834012e-05 ~khA TTI:-0.00143474387129284 ddh:-7.08505145415428e-10 ~khE TTI:-0.00143474387129284 ddh:-7.08505145415428e-10 ~kha ddh:-5.82715939834801e-10 ~kha~kha TTI:-0.00143474387129284 ddh:-7.08505145415428e-10 ~khe~khe ddh:-5.82715939834801e-10 ~ki TTI:-0.000123415793751178 ~kk TTI:-0.0063978807846004 ~kkO TTI:0.0517161707153928 ~kka TTI:-0.00264428979568476 ddh:-2.39371510894065e-18 ~kku TTI:-0.000447012591601141 njjo:-0.00411007289510381 ~ko ddh:-2.83845604763285e-10 ~ko~ko TTI:0.0517161707153928 ~ku TTI:0.0712309671284245 ~ku~ku TTI:-0.000447012591601141 njjo:-0.00411007289510381 ~l MO:-2.41774429664504e-08 mO:-2.41774429664504e-08 mo:-2.41774429664504e-08 ~lA TTI:-0.000147490387747709 ddh:-1.46333067851333e-10 njjo:-0.00125303442283516 ~lE TTI:-0.000147490387747709 ddh:-1.46333067851333e-10 njjo:-0.00125303442283516 ~lI TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 njjo:-0.0013669466430929 ~lai TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 njjo:-0.0013669466430929 ~lai~lai TTI:-0.00766143234527358 ~la~la TTI:-0.000147490387747709 ddh:-1.46333067851333e-10 njjo:0.00125303442283516 ~li MO:-8.61188658094846e-09 mO:-8.61188658094846e-09 mo:-8.61188658094846e-09 ~li~li TTI:-0.00119390023876806 ddh:-7.08505145415428e-10 njjo:-0.0013669466430929 ~mE TTI:-0.00119390023876806 ~mO~mO MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 ~mU MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:-1.04828147969798e-07 
~ma MO:-1.04828147969798e-07 TTI:-0.002259878048601 mO:-1.04828147969798e-07 mo:-1.04828147969798e-07 ~ma~ma TTI:-0.00119390023876806 ~mo~mo MO:-1.04828147969798e-07 mO:-1.04828147969798e-07 mo:1.04828147969798e-07 ~m~m TTI:-0.002259878048601 ~n ddh:-3.80383923642545e-10 ~nA~nA TTI:-0.0159521473758943 ddh:-3.72750083329056e-18 72
~na~na TTI:-0.0159521473758943 ddh:-3.72750083329056e-18 ~ng TTI:-0.00130392947786503 ~ngnga TTI:-0.00190506164886276 ~ngng~ngng TTI:-0.00130392947786503 ~ng~ng TTI:-0.00232820730024281 ~nj TTI:-0.00232820730024281 ~njjo~njnja njjo:-8.40256683676266e-19 ~njnjo njjo:-8.40256683676266e-19 ~n~n TTI:-0.00232820730024281 ~pa~pa TTI:-0.000398899396964083 ~ppA TTI:-0.000398899396964083 ~ppU TTI:-0.00247038227448224 ~ppa~ppa TTI:-0.00453011753087753 ~pu~pu TTI:-0.00247038227448224 ~r TTI:-0.000739228146821998 ddh:-5.46161792072088e-10 ~rA~rA MO:-4.86575717744618e-08 TTI:-0.00363944503886015 ddh:5.82715940162074e-10 mO:-4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 ~rE MO:-4.86575717744618e-08 TTI:-0.00363944503886015 ddh:-5.82715940162074e-10 mO:-4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 ~rU ddh:-5.46161792072088e-10 ~ra~ra MO:-4.86575717744618e-08 TTI:-0.00363944503886015 ddh:-1.12887773223416e09 mO:-4.86575717744618e-08 mo:-4.86575717744618e-08 njjo:0.0204932336165709 ~ri~ri TTI:-0.00411273956460726 ddh:-1.29615147987424e-10 njjo:-0.00697478417695654 ~ru~ru ddh:-5.46161792072088e-10 ~r~r TTI:-0.00732109550918932 ddh:-2.50768778710369e-10 njjo:-0.000148244086889984 ~s TTI:-0.00217452842494425 ~sU TTI:-0.000200245165257415 ~sa TTI:-0.000200245165257415 njjo:-0.000148244086889984 ~sh TTI:-0.00920977886205972 ~sh~sh TTI:0.0731846358804615 ~su~su TTI:-0.000200245165257415 ~s~s TTI:-0.00920977886205972 ~tA TTI:-0.00703525043711547 ~tE TTI:-0.00703525043711547 ~ta~ta TTI:-0.00703525043711547 ~th MO:-7.17361325808381e-08 TTI:-3.17806761809813e-18 mO:-7.17361325808381e-08 mo:-7.17361325808381e-08 njjo:-0.000135085406972062 ~thA TTI:-0.00703525043711547 njjo:-0.00411007289510381 ~thI MO:-4.75586896143878e-08 TTI:-3.17806761809813e-18 mO:-4.75586896143878e-08 mo:-4.75586896143878e-08 njjo:-0.000135085406972062 ~tha~tha TTI:-0.00703525043711547 njjo:-0.00411007289510381 ~thi~thi MO:-4.75586896143878e-08 TTI:-3.17806761809813e-18 
mO:4.75586896143878e-08 mo:-4.75586896143878e-08 njjo:-0.000135085406972062 ~ththA TTI:-0.00703525043711547 ~ththa njjo:-0.00411007289510381 ~ththi MO:-4.75586896143878e-08 TTI:-3.17806761809813e-18 mO:-4.75586896143878e08 mo:-4.75586896143878e-08 njjo:-0.000135085406972062 73
~ti~ti MO:-4.75586896143878e-08 TTI:-3.17806761809813e-18 mO:-4.75586896143878e08 mo:-4.75586896143878e-08 njjo:-0.000135085406972062 ~u~u ddh:-3.2727235499538e-19 ~vE TTI:-7.76041480165313e-05 ~va TTI:-0.00201620170280845 ~vaa TTI:-7.76041480165313e-05 ~va~va TTI:-0.00225213257296078 ~vi njjo:0.051172445470808 ~vva TTI:-7.76041480165313e-05 ~y TTI:-0.00156918911120731 njjo:0.051172445470808 ~yA~yA TTI:-0.00156918911120731 ~yU ddh:-3.2727235499538e-19 ~ya njjo:0.0441976612938515 ~ya~ya TTI:-0.0101008820850916 ddh:2.64794548787695e-09 ~yi TTI:-0.000147490387747709 ~yi~yi njjo:0.051172445470808 ~yu~yu ddh:-3.2727235499538e-19 ~yyA TTI:-0.00853169297388431 ddh:2.64794548787695e-09 ~y~y TTI:-0.000123415793751178
6.1.3 Transliteration Phase
The English words to be transliterated are segmented, converted into the SVMTool input format, and transliterated using the trained model. SVMTool uses a beam-search method to produce the most probable Malayalam names. The model is evaluated using the same SVMTool.
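The conversion step above can be sketched as follows. This is a minimal illustration, not the project's actual preprocessing code: SVMTool expects one token per line with a blank line between "sentences", and here each segmented chunk of a word is treated as a token. The segmentation rule used below (a consonant cluster followed by trailing vowels) is a simplifying assumption; the report's actual segmentation rules may differ.

```python
import re

def segment(word):
    # Split a romanized word into consonant-plus-vowel chunks,
    # e.g. "rama" -> ["ra", "ma"]. A simplified stand-in for the
    # project's real segmentation scheme.
    return re.findall(r"[^aeiou]+[aeiou]*|[aeiou]+", word)

def to_svmtool_format(words):
    # One segment per line; a blank line ends each word, which
    # SVMTool treats as a sentence boundary.
    lines = []
    for w in words:
        lines.extend(segment(w))
        lines.append("")
    return "\n".join(lines)

print(to_svmtool_format(["rama", "sita"]))
```

At tagging time, SVMTool would emit one predicted Malayalam segment per input line, which are then joined back into a full name.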
6.2 Training Using WEKA
Weka provides a comprehensive set of classification tools. Many of these algorithms are quite new and reflect an area of active development, and different algorithms perform differently depending on the characteristics of the data; it is for this reason that Weka offers so many of them. Some algorithms can be used for both regression and classification, while others support only one of the two. Some can handle only nominal attributes, while others can handle both nominal and ordinal/continuous variables. The Explorer interface is limited to
using one algorithm at a time and offers no tools for comparing different methods. Weka also supports specifying a separate, user-supplied test dataset. The default choice in the Explorer is 10-fold stratified cross-validation; when the sample size is very small, bootstrapping can be used instead. Bootstrapping is a sampling-with-replacement procedure: instances are chosen at random from the original dataset, optionally according to specified class probabilities, and placed into a new dataset. The resulting dataset can be many times larger than the original.
6.2.1 Output using WEKA
Using WEKA, 32 classes were classified with the J48, Naive Bayes, and NBTree classifiers. The following table shows the accuracy obtained for the different classes using the different classifiers.
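Sampling with replacement, as described above, can be sketched in a few lines. This is an illustrative example only: it draws uniformly at random, whereas Weka's resampling filter can additionally bias the draw toward specified class probabilities.

```python
import random

def bootstrap(dataset, size=None, seed=0):
    # Build a new dataset by drawing instances at random, with
    # replacement, from the original. Because draws are independent,
    # instances may repeat and the resample may be made larger than
    # the source dataset.
    rng = random.Random(seed)
    if size is None:
        size = len(dataset)
    return [rng.choice(dataset) for _ in range(size)]

original = ["inst1", "inst2", "inst3", "inst4"]
resample = bootstrap(original, size=8)  # twice the original size
```

A fixed seed is used here only so the sketch is reproducible; in practice each bootstrap replicate would use a fresh random draw.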
Class       J48      Naive Bayes   NBTree
Class#a     89.34    83.67         86.55
Class#aa    98.85    98.85         98.8
Class#b     84.21    93.74         84.2
6.2.2 Output source code
The following is the output obtained for class#a when trained using the J48 classifier.

=== Run information ===

Scheme:       weka.classifiers.trees.J48 -C 0.25 -M 2
Relation:     class-#a
Instances:    1004
Attributes:   298 [list of attributes omitted]
Test mode:    evaluate on training data

=== Classifier model (full training set) ===
J48 pruned tree
------------------

feature-214 <= 0
|   feature-62 <= 0: A (3.0)

Number of Leaves  :  15

Size of the tree :  29
Time taken to build model: 4.63 seconds

=== Evaluation on training set ===

=== Summary ===

Correctly Classified Instances         897               89.3426 %
Incorrectly Classified Instances       107               10.6574 %
Kappa statistic                          0.3309
Mean absolute error                      0.0945
Root mean squared error                  0.2174
Relative absolute error                 80.3337 %
Root relative squared error             90.0205 %
Total Number of Instances             1004
=== Detailed Accuracy By Class ===

TP Rate   FP Rate   Precision   Recall   F-Measure   ROC Area   Class
 0.997     0.77      0.893       0.997    0.942       0.65       a
 0.233     0.003     0.912       0.233    0.371       0.653      A
 0         0         0           0        0           0.603      e
 0         0         0           0        0           0.603      i

=== Confusion Matrix ===

 a b c d
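The per-class figures reported above (TP rate, FP rate, precision, recall, F-measure) are all derived from the confusion matrix. The sketch below shows the standard calculation; the small matrix used here is a made-up two-class example, not the report's actual confusion matrix.

```python
def class_metrics(matrix, i):
    # matrix[row][col]: rows are actual classes, columns are predicted.
    tp = matrix[i][i]
    fp = sum(row[i] for j, row in enumerate(matrix) if j != i)
    fn = sum(v for j, v in enumerate(matrix[i]) if j != i)
    total = sum(sum(row) for row in matrix)
    tn = total - tp - fp - fn
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0   # recall = TP rate
    fp_rate = fp / (fp + tn) if fp + tn else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return precision, recall, fp_rate, f_measure

# Hypothetical 2-class matrix for illustration.
m = [[8, 2],
     [1, 9]]
p, r, fpr, f1 = class_metrics(m, 0)
```

When a class is never predicted correctly, as for classes e and i above, TP is zero, so precision, recall, and F-measure all collapse to zero even though the ROC area can remain above 0.5.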