In this practice, application code analysis means static code analysis: reading the code through analysis tools to understand the structure of the application and its associated metrics (complexity, size, etc.). This analysis can also uncover the causes of the issues and concerns that stakeholders have raised about the application.
Static code analysis
Static code analysis is the analysis of software source code performed without actually executing the application program. It uses automated tools to analyze the code's structure and to reason about what the application could do in a potential execution. This allows the analyst to create diagrammatic or graphical representations of the code, which give a better understanding of the code's run-time effects.
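As an illustration, the short sketch below uses Python's standard-library ast module to inspect a source file without ever executing it; the file name billing.py is a hypothetical placeholder, and the elements printed stand in for whatever a real tool would report.

```python
# Minimal static analysis sketch: parse the source and walk its syntax tree
# without running any of it. "billing.py" is a hypothetical file name.
import ast

with open("billing.py") as f:
    tree = ast.parse(f.read())   # builds the syntax tree; no code executes

# List the structural elements an analyst might diagram.
for node in ast.walk(tree):
    if isinstance(node, ast.ClassDef):
        print(f"class    {node.name} (line {node.lineno})")
    elif isinstance(node, ast.FunctionDef):
        print(f"function {node.name} (line {node.lineno})")
    elif isinstance(node, ast.Import):
        print("imports :", ", ".join(alias.name for alias in node.names))
```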
Automated analysis tools are preferred over manual code reading, as manual reading can be very time consuming. In addition, to perform a manual analysis effectively, the code analyst must first know what vulnerabilities look like before they can rigorously examine the code. Static analysis tools compare favorably to manual analysis because they are faster, can be used to evaluate programs much more frequently, and can encapsulate knowledge in a way that does not require the tool operator to have the same level of expertise as a human analyst. On the other hand, these tools cannot replace a human analyst; they can only speed up the tasks that are easily automated. However, tools are very good at producing artifacts for the following (a small extraction sketch follows the list):
- Application governance
- Change analysis
- Reports for project estimation
- Lists and diagrams for program understanding
- Data flow analysis
- Test planning
- Application understanding
- Componentization of the application
- Detailed interfaces with:
  - Data stores
  - Transactions
  - Other applications
  - etc.
- Metrics:
  - Complexity
  - Lines of code
  - Functionality
  - etc.
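As one concrete example of such an artifact, the sketch below extracts a caller/callee edge list from Python source with the standard ast module. It is a deliberately simplified assumption of how a real tool works: module-level calls, methods, and dynamic dispatch are ignored, and the resulting edge list could feed a diagramming tool for program understanding.

```python
# Hedged sketch: build a caller -> callee edge list, one common "list or
# diagram" artifact. Only direct calls to plain names are captured.
import ast

def call_edges(source: str):
    edges = []
    for func in ast.walk(ast.parse(source)):
        if isinstance(func, ast.FunctionDef):
            for node in ast.walk(func):
                if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                    edges.append((func.name, node.func.id))
    return edges

print(call_edges("def a():\n    b()\n\ndef b():\n    pass\n"))
# [('a', 'b')]
```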
Metrics
Analysis of the metrics generated by source code analysis tools provides a measure of the degree to which the code under consideration possesses a given attribute, an attribute being a property of the code.
Metrics give the analyst critical insight into the application. Considered in isolation, a metric such as the number of defects per 1000 lines of code carries very little business or technical meaning. However, a metric such as the complexity value of a source code artifact, where a higher number means greater complexity, provides a much more useful and interesting relative value. It can be used to compare and contrast a given system's complexity and defect density, giving the analyst data that leads to problem areas quickly.
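For instance, the sketch below computes defect density per 1000 lines of code alongside complexity so that artifacts can be compared relative to one another rather than judged by a raw count alone. All figures are invented sample data, not measurements.

```python
# Illustrative comparison of artifacts by complexity and defect density.
artifacts = [
    # (name, lines of code, cyclomatic complexity, known defects)
    ("billing", 12_000, 310, 84),
    ("reports",  4_500,  60,  9),
    ("auth",     2_100,  95, 31),
]

for name, loc, complexity, defects in artifacts:
    density = defects / (loc / 1000)       # defects per KLOC
    print(f"{name:8s} complexity={complexity:4d} defects/KLOC={density:5.1f}")

# "auth" is the smallest artifact yet has the highest defect density,
# which points the analyst at a likely problem area.
```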
Source code analysis tools provide an array of different metrics that give vital information on the structure of the application source code. A practical implementation can use a larger set of metrics that provides significant insight into the structure of the application under consideration. To understand that structure, metrics need to be collected at different artifact levels; different artifact levels exist in the source code structure, such as modules, files, screens, handlers, and libraries. The most common metrics provided by various tools are (a brief measurement sketch follows the list):
- Complexity
- Functionality distribution
- Use (artifact used by other artifacts)
- Lines of code
- Defect possibility
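A minimal sketch of collecting two of these metrics per function artifact is shown below, again using Python's ast module. The branch counting is only a rough cyclomatic-complexity approximation; real tools count more constructs than this.

```python
# Collect lines of code and an approximate cyclomatic complexity per function.
import ast

BRANCHES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def function_metrics(source: str):
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            loc = node.end_lineno - node.lineno + 1
            # complexity ~ 1 + number of decision points in the function
            complexity = 1 + sum(isinstance(n, BRANCHES) for n in ast.walk(node))
            yield node.name, loc, complexity

src = "def f(x):\n    if x:\n        return 1\n    return 0\n"
for name, loc, cx in function_metrics(src):
    print(name, loc, cx)   # prints: f 4 2
```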
Collected metrics can be combined with the data gathered in the current state analysis to validate perceived issues with the application. Some metrics can also uncover the root cause of issues that stakeholders report. For example, if the issue is a large number of defects in a particular area, then the combination of lines of code, complexity, and defect possibility can indicate whether the problem lies in the structure of that part of the application. Here, functional density can give insight into whether the functionality provided by that part needs to be distributed or broken into more manageable parts, each providing a portion of the functionality.
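A hedged sketch of that kind of validation follows: a part of the application is flagged as a structural hotspot when both its complexity per line and its defect possibility exceed chosen thresholds. The thresholds and sample data are illustrative assumptions, not industry standards.

```python
# Flag structural hotspots by combining lines of code, complexity, and
# defect possibility. Sample data and thresholds are assumptions.
parts = {
    # name: (lines of code, complexity, defect possibility 0..1)
    "order-entry": (18_000, 950, 0.8),
    "invoicing":   ( 6_000, 120, 0.2),
}

def is_hotspot(loc, complexity, defect_possibility,
               max_cx_per_kloc=40, max_defect=0.5):
    return (complexity / (loc / 1000) > max_cx_per_kloc
            and defect_possibility > max_defect)

for name, (loc, cx, dp) in parts.items():
    print(name, "hotspot" if is_hotspot(loc, cx, dp) else "ok")
# order-entry hotspot
# invoicing ok
```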
Collected metrics data, combined with other analysis data (data flow, change analysis, interfaces, etc.), aids the architects in understanding the overall structure of the application and helps identify the road map that carries the lowest risk of achieving the organization's goal. Selecting a particular road map from this data is still subjective and depends on how effectively the collected data is interpreted. However, the selected road map can be analyzed further through selective analysis.
Selective analysis
Achieving the goal of this practice, that is, determining an application modernization strategy, requires frequent and deep analysis of various defined parts of the application. The parts that most commonly contribute to determining a strategy are the application's interfaces with other applications, data stores, users, third-party code, etc. It is important to define a boundary for the application before deep analysis is performed, since a defined boundary aids in identifying the application's interfaces. It is uncommon, though not unheard of, to include previously identified external artifacts within the application boundary; care is required to include external artifacts only when doing so helps define a modernization strategy or alternative. Avoid the temptation to include external artifacts merely to simplify analysis, as this can easily expand the scope of the analysis beyond a single application.
The most common deep analysis is impact analysis for a change. Impact analysis helps determine what will be affected by a change as one progresses through the steps of a given road map and its associated strategy, and it can quickly establish what needs to change and the complexity associated with that change. A number of impact analyses will be required to determine all the change complexity for a given road map; however, a trend quickly emerges that indicates the risk associated with a particular road map.
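A minimal sketch of change impact analysis over a dependency graph is given below. It assumes the graph (each artifact mapped to the artifacts that depend on it) was produced by the earlier static analysis, and the module names are hypothetical; the impact of a change is everything reachable along the "depended on by" edges.

```python
# Breadth-first reachability over a "depended on by" graph: the impact of
# changing an artifact is every artifact that transitively depends on it.
from collections import deque

dependents = {
    "db-layer": ["billing", "reports"],
    "billing":  ["ui"],
    "reports":  ["ui"],
    "ui":       [],
}

def impact(changed):
    seen, queue = set(), deque([changed])
    while queue:
        for dep in dependents.get(queue.popleft(), []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

print(sorted(impact("db-layer")))   # ['billing', 'reports', 'ui']
```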
Input:
- Project initiation is complete
- Application has been identified
- Source code is delivered
- Developers are available to work with the assessment team on this study
Output:
- Detailed architecture structure of the application
- Validation of the strategy that has the least complexity and risk