Machine code (also called machine language) is software that is executed directly by the CPU. Machine code is CPU-dependent; it is a series of 1s and 0s that translate into instructions the CPU understands.
Source code consists of programming-language instructions written as text; it must be translated into machine code before the CPU can execute it.
Assembly language is a low-level computer programming language whose mnemonics correspond closely to machine code instructions; an assembler translates it into machine code.
Compilers take source code, such as C or Basic, and compile it into machine code.
Interpreted languages differ from compiled languages: interpreted code (such as shell scripts) is translated and executed on the fly each time the program is run, rather than compiled into machine code in advance.
Procedural languages (also called procedure-oriented languages) use subroutines, procedures, and functions.
Object-oriented languages attempt to model the real world through the use of objects which combine methods and data.
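The contrast between the two styles can be sketched with a hypothetical bank-account example (the names `deposit` and `Account` are invented for illustration):

```python
# Procedural style: data and the functions that operate on it are separate.
def deposit(balance, amount):
    return balance + amount

# Object-oriented style: the Account object combines data (the balance)
# with the methods that act on that data.
class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

balance = deposit(100, 50)   # procedural call: state passed in and returned
acct = Account(100)
acct.deposit(50)             # method call: the object carries its own state
```

Both snippets end with a balance of 150; the object-oriented version models the account as a "thing" that owns its data and behavior.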
The different generations of languages:
- First-generation language: machine code
- Second-generation language: assembly
- Third-generation language: COBOL, C, Basic
- Fourth-generation language: ColdFusion, Progress 4GL, Oracle Reports
Application Development Methods
The Waterfall Model is a linear application development model that uses rigid phases; when one phase ends, the next begins.
The waterfall model contains the following steps:
- System requirements
- Software requirements
- Analysis
- Program design
- Coding
- Testing
- Operations
An unmodified waterfall does not allow iteration: going back to previous steps. This places a heavy planning burden on the earlier steps. Also, since each subsequent step cannot begin until the previous step ends, any delays in earlier steps cascade through to the later steps.
The unmodified Waterfall Model does not allow going back; the modified Waterfall Model allows going back at least one step.
The Sashimi Model has highly overlapping steps; it can be thought of as a real-world successor to the Waterfall Model (and is sometimes called the Sashimi Waterfall Model).
Sashimi’s steps are similar to the Waterfall Model’s; the difference is the explicit overlapping.
Scrum relies on small teams of developers, called the Scrum Team. They are supported by a Scrum Master, a senior member of the organization who acts as a coach for the team. Finally, the Product Owner is the voice of the business unit.
Extreme Programming (XP) is an Agile development method that uses pairs of programmers who work off a detailed specification.
The Spiral Model is a software development model designed to control risk.
The spiral model repeats steps of a project, starting with modest goals, and expanding outwards in ever wider spirals (called rounds). Each round of the spiral constitutes a project, and each round may follow traditional software development methodology such as Modified Waterfall. A risk analysis is performed each round.
In the context of the exam, the Systems Development Life Cycle (SDLC) focuses on security.
No matter what development model is used, these principles are important in order to ensure that the resulting software is secure:
- Security in the requirements – even before the developers design the software, the organization should determine what security features the software needs.
- Security in the design – the design of the application should include security features, such as input validation, strong authentication, and audit logging.
- Security in testing – the organization needs to test all the security requirements and design characteristics before declaring the software ready for production use.
- Security in the implementation – the software should be built and deployed in accordance with the security requirements and design.
- Ongoing security testing – after an application is implemented, security testing should be performed regularly, in order to make sure that no new security defects are introduced into the software.
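As a minimal sketch of the input-checking mentioned under "security in the design", a routine can validate untrusted input against a strict allow-list pattern before using it (the function name and username rules here are assumptions for illustration):

```python
import re

# Hypothetical input-checking routine: accept only usernames that match
# a strict allow-list of 3-20 alphanumeric/underscore characters.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,20}$")

def validate_username(name: str) -> bool:
    """Return True only if the name matches the allow-list pattern."""
    return bool(USERNAME_RE.fullmatch(name))
```

Allow-listing known-good input is generally safer than trying to block known-bad input, because attackers only need one bad pattern the filter missed.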
Software escrow describes the process of having a third party store an archive of computer software, typically source code, which can be released to the licensee if, for example, the vendor goes out of business.
Software vulnerability testing
Software testing methods
- Static testing – tests the code passively; the code is not running. This includes syntax checking and code reviews.
- Dynamic testing – tests the code while executing it.
- White box testing – gives the tester access to program source code.
- Black box testing – gives the tester no internal details; the application is treated as a black box that receives inputs and produces outputs.
Software testing levels
- Unit Testing – Low-level tests of software components, such as functions, procedures or objects
- Installation Testing – Testing software as it is installed and first operated
- Integration Testing – Testing multiple software components as they are combined into a working system.
- Regression Testing – Testing software after updates, modifications, or patches
- Acceptance Testing – Testing to ensure the software meets the customer’s operational requirements
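Unit testing is a sketch away in any language with a test framework; here is a minimal example using Python's standard `unittest` module (the `add` function under test is invented for illustration):

```python
import unittest

# A hypothetical function under test.
def add(a, b):
    return a + b

# Unit testing: low-level tests of a single software component.
class TestAdd(unittest.TestCase):
    def test_add_positive(self):
        self.assertEqual(add(2, 3), 5)

    def test_add_negative(self):
        self.assertEqual(add(-1, 1), 0)

# Run the tests programmatically and collect the result.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A regression suite is simply a collection of such tests re-run after every update or patch, so that a change which breaks previously working behavior is caught immediately.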
Fuzzing (also called fuzz testing) is a type of black box testing that enters random, malformed data as inputs into software programs to determine if they will crash.
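A toy fuzzer can be sketched in a few lines; the parser below and its record format are invented for this example, with a deliberate bug that random input can trigger:

```python
import random

# Hypothetical parser under test: byte 0 is a record count, and the
# remaining bytes are summed and averaged over that count.
def parse_record(data: bytes) -> int:
    if len(data) < 2:
        raise ValueError("too short")
    count = data[0]
    return sum(data[1:]) // count   # bug: crashes when count == 0

# Black-box fuzzing: feed random, malformed bytes and watch for crashes
# beyond the parser's documented error (ValueError).
random.seed(0)                      # fixed seed keeps the run repeatable
crashes = 0
for _ in range(5000):
    blob = bytes(random.randrange(256) for _ in range(random.randrange(20)))
    try:
        parse_record(blob)
    except ValueError:
        pass                        # expected, documented failure mode
    except Exception:
        crashes += 1                # unexpected crash: a fuzzing find
```

Even this crude random fuzzer eventually generates an input whose count byte is zero, exposing the division-by-zero crash that normal "happy path" testing would likely miss.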
Combinatorial software testing is a black-box testing method that seeks to identify and test all unique combinations of software inputs. An example of combinatorial software testing is pairwise testing (also called all pairs testing).
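The exhaustive end of combinatorial testing can be sketched with `itertools.product`; the parameter names and values below are assumptions for illustration:

```python
from itertools import product

# Hypothetical configuration parameters for an application under test.
params = {
    "os": ["linux", "windows"],
    "browser": ["firefox", "chrome", "safari"],
    "locale": ["en", "de"],
}

# Exhaustive combinatorial testing: every unique combination of inputs.
names = list(params)
all_combos = [dict(zip(names, values))
              for values in product(*params.values())]

# 2 * 3 * 2 = 12 test cases cover every full combination. Pairwise
# (all-pairs) testing would instead select a smaller subset of cases
# that still covers every pair of parameter values at least once.
```

The full product grows multiplicatively with each parameter, which is why pairwise testing is attractive: most defects are triggered by the interaction of at most two parameters, so covering all pairs catches them with far fewer test cases.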
Software Capability Maturity Model
The Software Capability Maturity Model (CMM) is a maturity framework for evaluating and improving the software development process.
The goal of CMM is to develop a methodical framework for creating quality software which allows measurable and repeatable results.
The five levels of CMM:
- Initial: The software process is characterized as ad hoc, and occasionally even chaotic.
- Repeatable: Basic project management processes are established to track cost, schedule, and functionality.
- Defined: The software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization.
- Managed: Detailed measures of the software process and product quality are collected, analyzed, and used to control the process. Both the software process and products are quantitatively understood and controlled.
- Optimizing: Continual process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.
A database is a structured collection of related data.
Types of databases:
- Relational databases – the structure of a relational database is defined by its schema. Records are called rows (or tuples), and rows are stored in tables. Databases must ensure the integrity of the data. There are three integrity issues that must be addressed beyond the correctness of the data itself: referential integrity (every foreign key in a secondary table matches a primary key in the parent table), semantic integrity (each column value is consistent with its attribute’s data type), and entity integrity (each tuple has a unique primary key that is not null). Data definition language (DDL) is used to create, modify, and delete tables. Data manipulation language (DML) is used to query and update data stored in tables.
- Hierarchical – data in a hierarchical database is arranged in tree structures, with parent records at the top of the database and a hierarchy of child records in successive layers.
- Object-oriented – the objects in an object database include data records as well as their methods.
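The relational concepts above (DDL, DML, entity and referential integrity) can be demonstrated with Python's built-in `sqlite3` module; the customer/order schema is invented for this sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # have SQLite enforce foreign keys

# DDL: create tables and define the schema.
conn.execute("""CREATE TABLE customers (
    id   INTEGER PRIMARY KEY,              -- entity integrity: unique, not null
    name TEXT NOT NULL)""")
conn.execute("""CREATE TABLE orders (
    id          INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(id))""")

# DML: insert and query rows.
conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Alice')")
conn.execute("INSERT INTO orders (id, customer_id) VALUES (10, 1)")
rows = conn.execute("SELECT name FROM customers").fetchall()

# Referential integrity: a foreign key must match an existing primary key,
# so an order for a nonexistent customer is rejected.
try:
    conn.execute("INSERT INTO orders (id, customer_id) VALUES (11, 99)")
    fk_violation = False
except sqlite3.IntegrityError:
    fk_violation = True
```

Note that SQLite only enforces referential integrity when the `foreign_keys` pragma is enabled; other relational databases enforce it by default.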
Database normalization seeks to make the data in a database table logically concise, organized, and consistent. Normalization removes redundant data, and improves the integrity and availability of the database.
Databases may be highly available, replicated over multiple servers containing multiple copies of data. Database replication mirrors a live database, allowing simultaneous reads and writes to multiple replicated databases. A two-phase commit can be used to ensure integrity.
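The two-phase commit idea can be sketched with a hypothetical `Participant` class (real systems delegate this to the DBMS; the class and method names here are assumptions):

```python
# Minimal two-phase commit sketch across replicated databases.
class Participant:
    def __init__(self, name):
        self.name = name
        self.data = {}
        self.pending = None

    def prepare(self, key, value):
        """Phase 1: stage the write and vote on whether it can be applied."""
        self.pending = (key, value)
        return True                  # this sketch always votes "yes"

    def commit(self):
        """Phase 2: apply the staged write after every replica voted yes."""
        key, value = self.pending
        self.data[key] = value
        self.pending = None

replicas = [Participant("db1"), Participant("db2")]

# Coordinator: commit on all replicas only if all of them are prepared;
# if any replica voted "no", the coordinator would abort on all of them.
if all(r.prepare("x", 42) for r in replicas):
    for r in replicas:
        r.commit()
```

Splitting the write into a prepare vote and a commit ensures all replicas apply the change or none do, which is how replicated databases keep their copies consistent.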
A shadow database is similar to a replicated database, with one key difference: a shadow database mirrors all changes made to the primary database, but clients do not have access to the shadow.
Expert systems consist of two main components. The first is a knowledge base that consists of “if/then” statements. These statements contain rules that the expert system uses to make decisions. The second component is an inference engine that follows the tree formed by the knowledge base, and fires a rule when there is a match.
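The two components can be sketched in a few lines of Python; the medical rules below are a toy knowledge base invented for illustration:

```python
# Knowledge base: "if/then" rules as (conditions, conclusion) pairs.
knowledge_base = [
    ({"has_fever", "has_cough"}, "may_have_flu"),
    ({"may_have_flu"}, "recommend_rest"),
]

def infer(facts):
    """Inference engine: fire any rule whose conditions match the known
    facts, repeating until no new conclusions can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in knowledge_base:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

result = infer({"has_fever", "has_cough"})
```

Starting from the observed facts, the engine fires the first rule to conclude `may_have_flu`, which in turn satisfies the second rule; this chaining of rules is what lets an expert system follow the tree formed by its knowledge base.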
Neural networks mimic the biological function of the brain. A neural network accumulates knowledge by observing events; it measures their inputs and outcome. Over time, the neural network becomes proficient at correctly predicting an outcome because it has observed several repetitions of the circumstances and has also been told the outcome each time.
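This "observe the inputs, be told the outcome, adjust" loop can be sketched with a single artificial neuron (a perceptron) learning the logical AND function; the training rule shown is the classic perceptron update, and all names are chosen for this example:

```python
# A single neuron: weighted sum of inputs plus a bias, thresholded at 0.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # told the outcome, compare to guess
            w[0] += lr * err * x1       # nudge weights toward the outcome
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Observed events: inputs and their outcome (here, logical AND).
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(samples)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

After repeatedly seeing the same circumstances and outcomes, the weights settle so the neuron predicts AND correctly, which is the learning-by-repetition behavior described above, in miniature.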