Community articles — Reports
Write up experiments and research with LaTeX templates for project and lab reports—including layout guidelines to help guide you through the writing process.
Recent

This work describes a simple application developed in Android Studio for the Android system.

We were asked to design our own electrocardiogram (ECG). Recording heartbeats without any noise is a real challenge, and given the prices of such devices we could already imagine that achieving good performance with our means would be difficult. However, the real goal of this project was to analyze the different problems we encountered and to think about possible improvements we could have made to overcome them.

Principal Component Analysis (PCA) and Canonical Correlation Analysis (CCA) are among the methods used in multivariate data analysis. PCA is concerned with explaining the variance-covariance structure of a set of variables through a few linear combinations of these variables; its general objectives are data reduction and interpretation. CCA seeks to identify and quantify the associations between two sets of variables, in this case pulp-fibre and paper variables. PCA shows that the first principal component alone already exceeds 90% of the total variability. According to the proportion of variability explained by each canonical variable, the results suggest that the first two canonical correlations are sufficient to explain the structure between the pulp and paper characteristics, accounting for 98.86%. Although the first two canonical variables retain 98% of the common variability, about 78% of the variability was retained in the pulp-fibre set and about 94% in the paper set as a whole. For the opposite canonical variables, approximately 64% of the paper set of variables and 78% of the pulp-fibre set of variables were retained by the two, respectively.
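For illustration, a minimal Python sketch of the kind of PCA/CCA computation described above is given below, assuming scikit-learn and NumPy; the arrays pulp and paper are random placeholders standing in for the actual pulp-fibre and paper measurements, which are not reproduced here.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
pulp = rng.normal(size=(62, 4))    # placeholder for the pulp-fibre variables
paper = rng.normal(size=(62, 4))   # placeholder for the paper variables

# PCA: proportion of total variability explained by each principal component
pca = PCA().fit(np.hstack([pulp, paper]))
print("explained variance ratio:", pca.explained_variance_ratio_)

# CCA: canonical correlations between the two sets of variables
cca = CCA(n_components=2).fit(pulp, paper)
U, V = cca.transform(pulp, paper)
print("canonical correlations:", [np.corrcoef(U[:, i], V[:, i])[0, 1] for i in range(2)])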

The goal of this project is to explore both the theory behind the Extended Kalman Filter and the way it was used to localize a four-wheeled mobile robot. This is achieved by estimating the pose of the robot in real time, using a map pre-acquired with a Laser Range Finder (LRF). The LRF is used to scan the environment, which is represented through line segments. In a prediction step, the robot simulates its kinematic model to predict its current position. An update step is then implemented to minimize the difference between the matched lines from the global and local maps. It should be noted that every measurement has an associated uncertainty that needs to be taken into account when performing each step of the Extended Kalman Filter. These uncertainties, or noise, are described by covariance matrices that play a very important role in the algorithm. Since we are dealing with an indoor structured environment, mainly composed of walls and straight-edged objects, the line-segment representation of the maps was the chosen approach to the problem.
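For illustration, a minimal Python sketch of one EKF predict/update cycle for a planar pose (x, y, theta) is given below, assuming a simple unicycle motion model and a generic matched-line measurement; the models, Jacobians and noise matrices are illustrative stand-ins, not the report's actual implementation.

import numpy as np

def predict(x, P, u, Q, dt):
    """Propagate the pose with the kinematic model and grow its covariance."""
    v, w = u                                   # linear and angular velocity
    th = x[2]
    x_pred = x + dt * np.array([v * np.cos(th), v * np.sin(th), w])
    F = np.array([[1.0, 0.0, -dt * v * np.sin(th)],
                  [0.0, 1.0,  dt * v * np.cos(th)],
                  [0.0, 0.0,  1.0]])           # Jacobian of the motion model
    return x_pred, F @ P @ F.T + Q

def update(x, P, z, h, H, R):
    """Correct the pose with a matched line measurement z, model h, Jacobian H."""
    y = z - h(x)                               # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P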

Math project.

With the emergence of PACS systems for the archiving and distribution of medical images, the need to store those images in an off-site environment also emerged. This project aims to study this need by offering a service, within an information infrastructure, that can integrate the local environment with a remote environment in a transparent and heterogeneous way, taking account of the needs of teleradiology users. Keywords: PACS, teleradiology, Medical Imaging, Archiving.

Benford's law states that the occurrence of digits in a large set of data is not uniformly distributed but instead follows a decreasing logarithmic distribution, with 1 occurring most often as the leading digit. Many sets of data follow this trend, and the law is widely used as a basis for various fraud-detection and forensic-accounting techniques. Benford's law is an observation that leading digits in data derived from measurements do not follow a uniform distribution. Different financial statements, namely the cash flows, income statements and balance sheets of 20 tech companies in the Fortune 500, are analyzed in this project. Cash flow is the net amount of cash and cash equivalents moving into and out of a business. An income statement is a financial statement that measures a company's financial performance over a specific accounting period. A balance sheet is a financial statement that summarizes a company's assets, liabilities and shareholders' equity at a specific point in time. All of these financial-statement data are extracted from the Morningstar database and analyzed with a Python program I wrote. I also wrote a Python program to calculate Benford's second-digit and third-digit probabilities using the formula. I would like to thank Prof. Erin Wagner and Dr. Courtney Taylor for helping with this research project.
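For reference, a minimal Python sketch of the standard Benford digit probabilities the project computes is given below; this is the textbook formula, not the author's actual program, and the printed examples are purely illustrative.

import math

def benford_prob(digit, position):
    """Benford probability of `digit` at `position` (1 = leading digit)."""
    if position == 1:
        if not 1 <= digit <= 9:
            raise ValueError("the leading digit must be 1-9")
        return math.log10(1 + 1 / digit)
    # For later positions, sum over every possible block of preceding digits.
    lo, hi = 10 ** (position - 2), 10 ** (position - 1)
    return sum(math.log10(1 + 1 / (10 * k + digit)) for k in range(lo, hi))

# First-digit distribution: 1 is most frequent (about 30.1%)
print([round(benford_prob(d, 1), 4) for d in range(1, 10)])
# Second-digit distribution: 0 is most frequent (about 12.0%)
print([round(benford_prob(d, 2), 4) for d in range(0, 10)])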

Gene regulatory networks (GRNs) play an important role in studying the behaviour of genes. By analysing these networks we can obtain detailed information, for example how changes in the behaviour of GRNs lead to the occurrence of diseases. Many different approaches are used (e.g. qualitative modelling and hybrid modelling) and various tools (e.g. GenoTech, GINsim) have been developed to model and simulate gene regulatory networks. GenoTech allows the user to specify a GRN through a graphical user interface (GUI) according to the asynchronous multivalued logical formalism of René Thomas, and to simulate and/or analyse its qualitative dynamical behaviour. René Thomas's discrete modelling of gene regulatory networks is a well-known approach to studying the dynamics of genes. It deals with parameters that reflect the possible targets of trajectories. These parameters are a priori unknown and are obtained using another model-checking tool, SMBioNet, which produces all the possible parameters satisfying a given Computation Tree Logic (CTL) formula supplied as input. This approach, involving logical parameters and conditions, is also known as qualitative modelling of GRNs. However, it neglects the time delay for a gene to pass from one expression level to another, e.g. from inhibition to activation and vice versa. To determine these time delays, another modelling tool, HyTech, is used to perform hybrid modelling of the GRN. We have developed a Java-based tool called GenNet (http://asanian.com/gennet) to assist the model-checking user by providing a single GUI for both qualitative and quantitative modelling of GRNs. As discussed, three separate modelling tools are otherwise required for the complete modelling and analysis of a GRN, a process that is lengthy and time-consuming. GenNet assists modelling users by providing extra features such as a CTL editor, parameter filtering and input/output file management. GenNet takes a GRN as input and performs all the remaining computations (CTL verification, K-parameter generation, parameter implication to the GRN, state-graph construction, hybrid modelling and parameter filtration) automatically. GenNet serves the user by computing within seconds results that previously took hours or days of manual computation.
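For illustration, a CTL property of the kind supplied to SMBioNet might, for a hypothetical two-gene network with genes $x$ and $y$, read $\mathrm{AG}\,((x = 2) \rightarrow \mathrm{AF}\,(y = 0))$: on every path, whenever $x$ reaches its highest expression level, $y$ is eventually fully inhibited. The concrete formulas used in this work are not given here.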

The impact crater formed by a small metal ball of mass 63.7 grams (0.0637 kg), dropped from 8 different heights ranging from 0.20 m to 0.90 m, was observed. The mean crater diameter was measured at each height. Using the equation $E = mg\Delta h$, where $m$ is known and $g$ is a constant of 9.81 m/s$^2$, we can find the kinetic energy of the ball on impact. The relationship between crater diameter, $D$, and impact energy, $E$, is given by $D = kE^n$, where $k$ is a constant and $n$, also constant, is found from the gradient of the graph. This can be rearranged to give $\log D = n\log E + \log k$.
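For illustration, a minimal Python sketch of extracting $n$ and $k$ from the log-log relation above is given below; the crater diameters are placeholder values, not the measurements reported in this experiment.

import numpy as np

m, g = 0.0637, 9.81                                  # ball mass (kg), gravity (m/s^2)
heights = np.array([0.20, 0.30, 0.40, 0.50, 0.60, 0.70, 0.80, 0.90])            # drop heights (m)
diameters = np.array([0.030, 0.033, 0.036, 0.038, 0.040, 0.042, 0.043, 0.045])  # placeholders (m)

E = m * g * heights                                  # impact energy E = mg*delta_h (J)
n, log_k = np.polyfit(np.log10(E), np.log10(diameters), 1)   # slope n, intercept log k
print(f"n = {n:.3f}, k = {10 ** log_k:.3f}")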