
Higher Nationals

Internal verification of assessment decisions – BTEC (RQF)

INTERNAL VERIFICATION – ASSESSMENT DECISIONS


Programme title BTEC HND in Computing

Assessor
Internal Verifier
Unit(s): Unit 04: Database Design & Development
Assignment title: Database Solution for Quiet Attic Films

Student’s name
List which assessment criteria the Assessor has awarded: Pass / Merit / Distinction
INTERNAL VERIFIER CHECKLIST
Do the assessment criteria awarded match those shown in the assignment brief? Y/N
Is the Pass/Merit/Distinction grade awarded justified by the assessor’s comments on the student work? Y/N
Has the work been assessed accurately? Y/N
Is the feedback to the student: (Give details)
• Constructive? Y/N
• Linked to relevant assessment criteria? Y/N
• Identifying opportunities for improved performance? Y/N
• Agreeing actions? Y/N
Does the assessment decision need amending? Y/N

Assessor signature Date

Internal Verifier signature Date


Programme Leader signature (if
required) Date
Confirm action completed
Remedial action
taken
Give details:

Assessor signature Date


Internal
Verifier Date
signature
Programme Leader
signature (if Date
required)
Higher Nationals - Summative Assignment Feedback Form
Student Name/ID
Unit Title Unit 04: Database Design & Development

Assignment Number 1 Assessor


Submission Date          Date Received 1st submission
Re-submission Date       Date Received 2nd submission
Assessor Feedback:
LO1 Use an appropriate design tool to design a relational database system for a substantial
problem
Pass, Merit & Distinction Descriptors: P1, M1, D1

LO2 Develop a fully functional relational database system, based on an existing system
design
Pass, Merit & Distinction Descriptors: P2, P3, M2, M3, D2

LO3 Test the system against user and system requirements.


Pass, Merit & Distinction Descriptors: P4, M4, D2

LO4 Produce technical and user documentation.


Pass, Merit & Distinction Descriptors: P5, M5, D3

Grade: Assessor Signature: Date:


Resubmission Feedback:

Grade: Assessor Signature: Date:


Internal Verifier’s Comments:

Signature & Date:

* Please note that grade decisions are provisional. They are only confirmed once internal
and external moderation has taken place and grade decisions have been agreed at the
assessment board.
Assignment Feedback
Formative Feedback: Assessor to Student

Action Plan

Summative feedback

Feedback: Student to Assessor

Assessor Date
signature

Student Date
signature
Pearson Higher Nationals in
Computing

Unit 04: Database Design & Development


Assignment 01

General Guidelines

1. A cover page or title page – You should always attach a title page to your
assignment. Use the previous page as your cover sheet and make sure all the details are
filled in accurately.
2. Attach this brief as the first section of your assignment.
3. All assignments should be prepared using word processing software.
4. All assignments should be printed on A4 sized paper. Use single-sided printing.
5. Allow 1” for the top, bottom and right margins, and 1.25” for the left margin of each page.

Word Processing Rules

1. The font size should be 12 point, and the font should be Times New Roman.
2. Use 1.5 line spacing. Left justify all paragraphs.
3. Ensure that all the headings are consistent in terms of the font size and font style.
4. Use the footer function in the word processor to insert your name, subject,
assignment number, and page number on each page. This is useful if individual sheets
become detached for any reason.
5. Use the word processor’s spell check and grammar check functions to help edit
your assignment.

Important Points:

1. It is strictly prohibited to use text boxes to add text in the assignment, except for
compulsory information, e.g. figures, tables of comparison, etc. Adding text boxes
in the body except for the aforementioned compulsory information will result in
rejection of your work.
2. Carefully check the hand-in date and the instructions given in the assignment. Late
submissions will not be accepted.
3. Ensure that you give yourself enough time to complete the assignment by the due
date.
4. Excuses of any nature will not be accepted for failure to hand in the work on time.
5. You must take responsibility for managing your own time effectively.
6. If you are unable to hand in your assignment on time and have valid reasons such as
illness, you may apply (in writing) for an extension.
7. Failure to achieve at least PASS criteria will result in a REFERRAL grade.
8. Non-submission of work without valid reasons will lead to an automatic REFERRAL.
You will then be asked to complete an alternative assignment.
9. If you use other people’s work or ideas in your assignment, reference them properly
using the Harvard referencing system to avoid plagiarism. You have to provide both
in-text citations and a reference list.
10. If you are proven to be guilty of plagiarism or any academic misconduct, your grade
could be reduced to a REFERRAL or, at worst, you could be expelled from the course.

Student Declaration

I hereby, declare that I know what plagiarism entails, namely to use another’s work and to
present it as my own without attributing the sources in the correct form. I further
understand what it means to copy another’s work.

1. I know that plagiarism is a punishable offence because it constitutes theft.


2. I understand the plagiarism and copying policy of Edexcel UK.
3. I know what the consequences will be if I plagiarise or copy another’s work in any of
the assignments for this program.
4. I declare therefore that all work presented by me for every aspect of my program,
will be my own, and where I have made use of another’s work, I will attribute the
source in the correct way.
5. I acknowledge that the attachment of this document, signed or not, constitutes a
binding agreement between myself and Pearson, UK.
6. I understand that my assignment will not be considered as submitted if this
document is not attached to the assignment.

Student’s Signature: Date:


(Provide E-mail ID) (Provide Submission Date)

Higher National Diploma in Computing


Assignment Brief
Student Name /ID Number

Unit Number and Title Unit 4: Database Design & Development

Academic Year 2022/23

Unit Tutor

Assignment Title Database system for Quiet Attic Films

Issue Date

Submission Date

IV Name & Date

Submission format

Part 1: The submission should be in the form of an individual written report written in a concise,
formal business style using single spacing and font size 12. You are required to make use of
headings, paragraphs and subsections as appropriate, and all work must be supported with
research and referenced using the Harvard referencing system. Please also provide in-text
citations and a bibliography using the Harvard referencing system. The recommended word limit
is 3,000–3,500 words, although you will not be penalised for exceeding the total word limit.
Part 2: The submission should be in the form of a fully functional relational database system
demonstrated to the Tutor; and an individual written report (please see details in Part 1 above).
Part 3: The submission should be in the form of a witness statement of the testing completed
by the Tutor; technical documentation; and a written report (please see details in Part 1 above).
Unit Learning Outcomes:

LO1 Use an appropriate design tool to design a relational database system for a substantial
problem.
LO2 Develop a fully functional relational database system, based on an existing system design.
LO3 Test the system against user and system requirements.
LO4 Produce technical and user documentation.
Assignment Brief and Guidance:
Assignment brief
Quiet Attic Films is a film production company based in London, England, that specializes in
making short information films and advertisements for television. They want you to design
and implement a database that meets the requirements for their data. These requirements
are specified in this scenario and the examples of paper documents kept by the company
shown below.
Quiet Attic Films organizes its data around the concept of a ‘production’. A production is
specified as being for a particular client; but note that a client might have more than one
production at any time. A production will take place at one or more locations. A production
will also use a number of what are called ‘properties’, which might be anything from an actual
property like a building, to costumes or small items of any sort. It is important to keep a
record of which properties are required at which location.
There should also be a record kept of the staff types that are assigned to productions.

Activity 1
Identify the user and system requirements to design a database for the above scenario and
design a relational database system using conceptual design (ER model), including the
identifiers (primary keys) of entities and the cardinality and participation of relationships.
Convert the ER model into a logical database design using the relational database model,
including primary keys, foreign keys and referential integrity.
It should contain at least five interrelated tables. Check whether the provided logical design
is normalised. If not, normalise the database by removing the anomalies.
(Note: You are allowed to make your own assumptions and add related attributes within the
scope of the given case study.)
Design a set of simple input and output interfaces for the above scenario using wireframes
or any interface-designing tool. Evaluate the effectiveness of the given design (ERD and
logical design) in terms of the identified user and system requirements.

Activity 2
Develop a relational database system according to the ER diagram you have created (Use
SQL DDL statements). Provide evidence of the use of a suitable IDE to create a simple
interface to insert, update and delete data in the database. Implement proper security
mechanisms in the developed database and evaluate the database solution developed in
terms of its effectiveness with relevance to the user and system requirements identified,
system security mechanisms (e.g. user groups, access permissions) and the maintenance of
the database. Suggest improvements for any identified problems.

Assess the usage of the SQL statements below, with examples from the developed
database, to prove that the data extracted through them is meaningful and relevant to the
given scenario.
Select/Where / Update / Between / In / Group by / Order by / Having
Activity 3
3.1 Provide a suitable test plan to test the system against user and system requirements.
Provide relevant test cases for the database you have implemented. Assess how the selected
test data can be used to improve the effectiveness of testing.
Note: The learner needs to give the expected results in a tabular format and screenshots of
the actual results, with a conclusion.

3.2 Get independent feedback on your database solution from non-technical users and
some developers (use surveys, questionnaires, interviews or any other feedback-collecting
method) and draw a separate conclusion from the feedback.
Activity 4

Produce technical and user documentation for a fully functional system, including data flow
diagrams showing movement of data through the system, and flowcharts describing how the
system works. Evaluate the developed database by suggesting future enhancements to
ensure the effectiveness of the system.
Grading Criteria Achieved Feedback

LO1 Use an appropriate design tool to design a relational database system for a substantial problem.

P1 Design a relational database system using appropriate design tools and techniques,
containing at least four interrelated tables, with clear statements of user and system
requirements.
M1 Produce a comprehensive design for a fully-functional
system, which includes interface and output designs, data
validations and data normalization.
D1 Evaluate the effectiveness of the design in relation to user
and system requirements.

LO2 Develop a fully-functional relational database system, based on an existing system design.
P2 Develop the database system with evidence of user
interface, output and data validations, and querying across
multiple tables.

P3 Implement a query language into the relational database system.
M2 Implement a fully-functional database system, which includes system security and
database maintenance.
M3 Assess whether meaningful data has been extracted through the use of query tools to
produce appropriate management information.
LO3 Test the system against user and system requirements

P4 Test the system against user and system requirements.

M4 Assess the effectiveness of the testing, including an explanation of the choice of test data used.
D2 Evaluate the effectiveness of the database solution in
relation to user and system requirements and suggest
improvements.
LO4 Produce technical and user documentation

P5 Produce technical and user documentation.

M5 Produce technical and user documentation for a fully-functional system, including data
flow diagrams and flowcharts, describing how the system works.

D3 Evaluate the database in terms of improvements needed to ensure the continued
effectiveness of the system.
Introduction

Quiet Attic Films, located in the bustling heart of London, England, is renowned for
producing short informational films and captivating TV advertisements. As the company
grows and takes on more diverse projects, the need for a more efficient data management
system becomes increasingly apparent. Currently reliant on paper-based documentation,
Quiet Attic Films faces challenges related to efficiency, accuracy, and accessibility. The shift
to a digital database system promises to modernize operations, reduce errors, and boost
overall productivity.
At the core of Quiet Attic Films' operations are 'productions,' each uniquely tailored for a
specific client. Clients often have multiple productions running concurrently, necessitating a
system that can handle this complexity. Each production involves various types of employees
and a range of properties, including costumes, vehicles, and buildings. The proposed database
will centralize and streamline the management of these elements, replacing the cumbersome
paper-based system with a more organized and efficient digital solution.
The objective of this report is to design and implement a database that addresses the specific
needs of Quiet Attic Films. By analyzing the current workflow and existing paper documents,
we can identify the key areas where a digital system can bring substantial improvements. The
new database will be structured around productions, linking each to its respective client,
thereby ensuring that all relevant information is easily accessible and manageable.
A significant feature of the database will be its capability to manage various employee roles
and properties associated with each production. This includes tracking the availability and
allocation of resources, ensuring that logistical aspects are well-coordinated. By knowing
which properties are needed at specific locations and which staff members are assigned to
particular tasks, the database will enhance planning and execution, ensuring smooth and
efficient project management for every production.
Activity 01.

LO1 Use an appropriate design tool to design a relational database system for a substantial problem.

1.1. User and system requirements for Quiet Attic Films.

1.1.1. User Requirements

In software engineering, user and system requirements play a crucial role in the successful
development of any system. User requirements specifically refer to the broad, high-level
needs that describe the expected functionalities and constraints of the system. These
requirements are typically presented in natural language and visual formats to ensure they are
easily understandable by end users. They serve as a fundamental basis for the design and
implementation of the system, guiding developers in creating a product that meets user
expectations and needs.
Understanding and clearly defining user requirements is essential because they provide a
roadmap for what the system should achieve. They encompass the services the system must
deliver and the boundaries within which it must operate. This detailed articulation helps in
aligning the final product with user expectations, ensuring that the system is user-friendly,
efficient, and effective in addressing the users' problems. User requirements facilitate
communication between stakeholders, including clients, developers, and project managers.
They help in setting clear goals and objectives, reducing the risk of misunderstandings and
ensuring that all parties have a shared vision of the project's outcome. By capturing the user's
perspective, these requirements ensure that the system is developed with the end-user's needs
at the forefront, leading to higher satisfaction and better user adoption rates. User
requirements are vital in software engineering as they define the necessary services and
constraints of the system in a way that is understandable to users. They are a key element in
the development process, ensuring that the final product aligns with user needs and
expectations.

1.1.2. System Requirements

System requirements are the detailed technical prerequisites essential for a system to perform
its intended functions. These requirements encompass several critical aspects:
Functional Specifications - These define what the system must do. They outline the core
functions and features that the system needs to perform to meet the user requirements
effectively.

Data Specifications - These describe the data the system must process. They detail the types,
formats, and sources of data the system will handle, ensuring it can manage and utilize this
information appropriately.

Quality Specifications - These cover non-functional requirements, such as:

Precision: The accuracy of the system's operations and outputs.
Dependability: The system's reliability and availability.
Efficiency: The performance and resource usage of the system.
User-friendliness: The ease with which users can interact with the system.

Constraints - These define the limits within which the system must operate. Constraints can
include hardware and software limitations, regulatory requirements, and other external
factors that impact the system's design and functionality.

System requirements are vital for guiding the development process, ensuring that the final
product not only meets user needs but also adheres to technical standards and operational
limitations. They provide a comprehensive framework for what the system must achieve and
how it should perform, ultimately ensuring that the system is robust, efficient, and aligned
with both user and business objectives.

Table 1 - user requirements and system requirements

User Requirements

• The system must include user-friendly functionalities that facilitate the seamless creation, modification, and deletion of productions, clients, locations, properties, and staff types.
• Appropriate security measures must be implemented to protect confidential data stored in the database.
• The database should offer robust search capabilities, enabling users to effortlessly find productions, clients, locations, properties, and staff types using various criteria such as name, location, date, or client.
• The system must track the status of productions, clearly indicating whether they are in the pre-production, production, or post-production stage.
• The system must allow for the scheduling of staff and equipment for each production.
• The database must monitor user permissions and access levels, ensuring that sensitive information is accessible only to authorized individuals.
• The database must include a backup and recovery mechanism to prevent data loss in the event of system failures or other disasters.

System Requirements

• The following tables should be created: "Production", "Client", "Location", "Staff", "Staff Type", "Property", "Payment", "UserAccount", "Client_ContactNo", "Location_ContactNo", "Staff_ContactNo", "Production_Staff", "Production_Location", and "Location_Property".
• The "Production" table should include "Payment_ID" and "Client_ID" as foreign keys.
• The "Staff" table should include "StaffType_ID" as a foreign key.
• The "UserAccount" table should include "Staff_ID" as a foreign key.
• The "Client_ContactNo" table should include "Client_ID" as a foreign key.
• The "Location_ContactNo" table should include "Location_ID" as a foreign key.
• The "Staff_ContactNo" table should include "Staff_ID" as a foreign key.
• The "Production_Staff" table should include "Prod_ID" and "Staff_ID" as foreign keys.
• The "Production_Location" table should include "Prod_ID" and "Location_ID" as foreign keys.
• The "Location_Property" table should include "Location_ID" and "Property_ID" as foreign keys.

1.1.3. Entity Relationship Diagram (ER Diagram) for Quiet Attic Films.
Table 2 - entity table

Entity | Attributes | Primary Key
Client | Client_ID, NIC, First_Name, Last_Name, Birth_Day, No, Line1, Line2, City, Email, State, Contact_No | Client_ID
Production | Prod_ID, Prod_Name, Prod_Type, Start_Date, End_Date, Prod_State, State, Note | Prod_ID
Staff | Staff_ID, First_Name, Last_Name, Email, State, Contact_No | Staff_ID
Location | Location_ID, Location_Name, No, Line1, Line2, City, Email, State, Contact_No | Location_ID
Staff Type | StaffType_ID, Type, State, Fee | StaffType_ID
Payment | Payment_ID, Amount, State, Date | Payment_ID
Property | Property_ID, Property_Name, State, Property_Type | Property_ID
UserAccount | User_ID, Username, Password, First_Name, Last_Name, Profile_Picture, State | User_ID

Table 3 - foreign key table

Entity | Foreign Key
Production | Client_ID, Payment_ID
UserAccount | Staff_ID
Staff | StaffType_ID

1.1.4. Relational Schema

Production (Prod_ID, Prod_Name, Prod_Type, Start_Date, End_Date, Client_ID, Payment_ID, Note, Prod_State, State)
Client (Client_ID, NIC, First_Name, Last_Name, Birth_Day, No, Line1, Line2, City, Email, State)
Client_ContactNo (Client_ID, Contact_No, State)
Location (Location_ID, Location_Name, No, Line1, Line2, City, Email, State)
Location_ContactNo (Location_ID, Contact_No)
StaffType (StaffType_ID, Type, Fee, State)
Staff (Staff_ID, First_Name, Last_Name, Email, StaffType_ID, State)
Staff_ContactNo (Staff_ID, Contact_No)
Property (Property_ID, Property_Name, Property_Type, State)
Payment (Payment_ID, Amount, Date, State)
UserAccount (User_ID, Username, Password, First_Name, Last_Name, Profile_Picture, Staff_ID, State)
Production_Staff (Prod_ID, Staff_ID, Date)
Production_Location (Prod_ID, Location_ID, Date)
Location_Property (Location_ID, Property_ID, Date)

1.1.5. Logical Schema


Figure 1: Logical Schema Diagram
1.2. Data Normalization

Normalization, a key database design methodology, significantly reduces data redundancy
and mitigates anomalies such as insertion, update, and deletion anomalies. This process
entails decomposing larger tables into smaller, more manageable ones, which are then
interconnected through defined relationships. The primary goal of normalization in SQL is to
eliminate superfluous or repetitive data and ensure that data is stored in a logical, coherent
manner.
The concept of data normalization was introduced by Edgar Codd, the pioneer of the
relational model. Codd's work began with the establishment of the First Normal Form (1NF),
which set the foundation for the normalization process. He later expanded this theory by
introducing the Second Normal Form (2NF) and Third Normal Form (3NF), each building
upon the principles of its predecessor to further refine and organize data structures. These
normal forms addressed various types of redundancy and dependencies, ensuring that the data
remained consistent and free from anomalies.
In addition to his work on the first three normal forms, Codd collaborated with Raymond F.
Boyce to formulate the Boyce-Codd Normal Form (BCNF). BCNF is a higher-level normal
form that addresses certain types of redundancy not covered by 3NF, providing an even more
robust framework for database normalization. The Boyce-Codd Normal Form ensures that the
database schema is free of anomalies by ensuring that every determinant is a candidate key,
thus enhancing the integrity and efficiency of the database design.
Normalization involves a structured approach to decomposing tables. In 1NF, tables are
organized so that each column contains atomic, indivisible values, and each record is unique.
Moving to 2NF, tables must first meet all the criteria of 1NF and also ensure that all non-key
attributes are fully functionally dependent on the primary key. Finally, 3NF requires that
tables meet all the criteria of 2NF and that all the attributes are only dependent on the primary
key, removing transitive dependencies.
By adhering to these principles, database designers can create systems that not only store data
more efficiently but also simplify data management and maintenance. This logical structuring
of data minimizes redundancy and ensures data integrity, ultimately leading to more reliable
and robust database systems.

Normalization Forms,
1. 1NF (First Normal Form)
2. 2NF (Second Normal Form)
3. 3NF (Third Normal Form)
4. BCNF (Boyce-Codd Normal Form)
5. 4NF (Fourth Normal Form)
6. 5NF (Fifth Normal Form)
7. 6NF (Sixth Normal Form)

The theory of data normalization continues to be developed, and a Sixth Normal Form has
been discussed. However, in most practical applications, the best results are typically
achieved by normalizing to the Third Normal Form.
Table 4 - unnormalized data

P_ID | Client | Locations | Production Type | Staff | Properties | No. of Days
9 | Colombo Tourism Board | 1. Galle Face Green, Colombo; 2. Gangaramaya Temple, Colombo | Travel Documentary | 1 x Director, 2 x Camera crew, 1 x Sound Technician, 1 x Runner | High-definition camera, Drone for aerial shots | 4
10 | Kandy Cultural Association | 1. Temple of the Tooth, Kandy; 2. Peradeniya Botanical Garden, Kandy | Cultural Film | 2 x Camera crew, 1 x Lighting Technician, 1 x Director, 2 x Dancers | Traditional costumes, Lighting equipment | 3
11 | Wildlife Conservation | 1. Yala National Park; 2. Sinharaja Forest Reserve | Wildlife Documentary | 1 x Director, 2 x Camera crew, 1 x Wildlife Expert, 1 x Runner | Wildlife tracking devices, Nature sound recording equipment | 5
1.2.1. First Normal Form (1NF)
First Normal Form (1NF) aims to eliminate multi-valued attributes, ensuring that each
intersection of a row and column holds a single value, thereby avoiding repeating groups. To
normalize a relation with repeating groups, the repeating group must be removed and split
into two separate relations. The Primary Key of the new relation is typically a composite key,
combining the original Primary Key with a unique attribute from the new relation, which
helps maintain unique identification.
Table 5 - first normalized table 1

P_ID | Client | Location | Production Type | Staff | Properties | No. of Days
9 | Colombo Tourism Board | Galle Face Green, Colombo | Travel Documentary | 1 x Director | High-definition camera | 4
9 | Colombo Tourism Board | Gangaramaya Temple, Colombo | Travel Documentary | 2 x Camera crew | Drone for aerial shots | 4
9 | Colombo Tourism Board | Gangaramaya Temple, Colombo | Travel Documentary | 1 x Sound Technician | Drone for aerial shots | 4
9 | Colombo Tourism Board | Gangaramaya Temple, Colombo | Travel Documentary | 1 x Runner | Drone for aerial shots | 4
10 | Kandy Cultural Association | Temple of the Tooth, Kandy | Cultural Film | 2 x Camera crew | Traditional costumes | 3
10 | Kandy Cultural Association | Peradeniya Botanical Garden, Kandy | Cultural Film | 1 x Lighting Technician | Lighting equipment | 3
10 | Kandy Cultural Association | Peradeniya Botanical Garden, Kandy | Cultural Film | 1 x Director | Lighting equipment | 3
10 | Kandy Cultural Association | Peradeniya Botanical Garden, Kandy | Cultural Film | 2 x Dancers | Lighting equipment | 3
11 | Wildlife Conservation | Yala National Park | Wildlife Documentary | 1 x Director | Wildlife tracking devices | 5
11 | Wildlife Conservation | Sinharaja Forest Reserve | Wildlife Documentary | 2 x Camera crew | Nature sound recording equipment | 5
11 | Wildlife Conservation | Sinharaja Forest Reserve | Wildlife Documentary | 1 x Wildlife Expert | Nature sound recording equipment | 5
11 | Wildlife Conservation | Sinharaja Forest Reserve | Wildlife Documentary | 1 x Runner | Nature sound recording equipment | 5

Table 6 - first normalized table 2

P_ID Staff
9 1 x Director
9 2 x Camera crew
9 1 x Sound Technician
9 1 x Runner
10 2 x Camera crew
10 1 x Lighting Technician
10 1 x Director
10 2 x Dancers
11 1 x Director
11 2 x Camera crew
11 1 x Wildlife Expert
11 1 x Runner

1.2.2. Second Normal Form (2NF)

In order for a table to meet the requirements of the second normal form (2NF), it must first
adhere to the principles of the first normal form (1NF). This means each attribute in the table
must contain only atomic (indivisible) values, ensuring there are no repeating groups.
The essence of the second normal form lies in the structure of the Primary Key. If a table's
Primary Key consists of a single attribute, the table automatically meets the criteria for 2NF.
However, if the Primary Key is composite (composed of multiple attributes), additional
conditions apply. In such cases, every non-key attribute must depend on the entire composite
Primary Key, rather than on just a part of it. This ensures that no partial dependencies exist,
meaning each non-key attribute is functionally dependent on the entire Primary Key and not
just on a subset.
By eliminating partial dependencies, the second normal form aims to reduce redundancy and
improve data integrity within relational databases. This normalization step helps in
organizing data more efficiently, ensuring that each table structure is optimized for storage,
retrieval, and maintenance of data.

Table 7 - second normalized table

P_ID | Client | Location | Production Type | Staff | Properties | No. of Days
9 | Colombo Tourism Board | Galle Face Green, Colombo | Travel Documentary | 1 x Director | High-definition camera | 4
9 | Colombo Tourism Board | Gangaramaya Temple, Colombo | Travel Documentary | 2 x Camera crew | Drone for aerial shots | 4
9 | Colombo Tourism Board | Gangaramaya Temple, Colombo | Travel Documentary | 1 x Sound Technician | Drone for aerial shots | 4
9 | Colombo Tourism Board | Gangaramaya Temple, Colombo | Travel Documentary | 1 x Runner | Drone for aerial shots | 4
10 | Kandy Cultural Association | Temple of the Tooth, Kandy | Cultural Film | 2 x Camera crew | Traditional costumes | 3
10 | Kandy Cultural Association | Peradeniya Botanical Garden, Kandy | Cultural Film | 1 x Lighting Technician | Lighting equipment | 3
10 | Kandy Cultural Association | Peradeniya Botanical Garden, Kandy | Cultural Film | 1 x Director | Lighting equipment | 3
10 | Kandy Cultural Association | Peradeniya Botanical Garden, Kandy | Cultural Film | 2 x Dancers | Lighting equipment | 3
11 | Wildlife Conservation | Yala National Park | Wildlife Documentary | 1 x Director | Wildlife tracking devices | 5
11 | Wildlife Conservation | Sinharaja Forest Reserve | Wildlife Documentary | 2 x Camera crew | Nature sound recording equipment | 5
11 | Wildlife Conservation | Sinharaja Forest Reserve | Wildlife Documentary | 1 x Wildlife Expert | Nature sound recording equipment | 5
11 | Wildlife Conservation | Sinharaja Forest Reserve | Wildlife Documentary | 1 x Runner | Nature sound recording equipment | 5

Table 8 - second normalized table 2 production

Production ID Properties
9 High-definition camera
9 Drone for aerial shots
10 Traditional costumes
10 Lighting equipment
11 Wildlife tracking devices
11 Nature sound recording equipment

1.2.3. Third Normal Form (3NF)

The Third Normal Form (3NF) is a database schema design methodology aimed at reducing
data redundancy, avoiding anomalies, ensuring referential integrity, and simplifying data
management. A table is in 3NF if it meets all the requirements of the Second Normal Form
(2NF) and lacks transitive dependencies among non-prime attributes. In simpler terms, 3NF
ensures that every non-key attribute is directly dependent only on the primary key, thereby
eliminating indirect relationships between non-key attributes.
1.2.4. Wireframes

//TO DO
Need To Attach WireFrames Here
1.2.5. Data Validation
Data validation is the systematic process of ensuring that data is accurate, clean, and reliable.
This involves implementing various verification measures within a system or report to
guarantee the consistency and correctness of both input and stored data. As automated
systems frequently manage data entry with minimal or no human oversight, it is crucial to
verify that the entered information meets predefined quality standards. Accurate data entry is
essential for effective data utilization, as inaccuracies can lead to significant issues in
subsequent reporting and analysis processes. Even accurately entered unstructured data will
incur costs associated with cleaning, converting, and retaining the data.
1.2.6. Importance of Data Validation
The primary goal of data validation is to ensure data integrity and reliability, which are vital
for making informed decisions. High-quality data helps organizations avoid costly errors and
improves operational efficiency. Inaccurate data can lead to flawed insights, misguided
strategies, and ultimately, business losses. Moreover, data validation ensures compliance with
regulatory standards, which is crucial in sectors like healthcare, finance, and legal industries.
By validating data, organizations can maintain the trust of stakeholders and enhance their
overall data governance framework.
1.2.7. Types of Data Validation
There are several types of data validation, each designed to ensure data accuracy before it is
stored in a database. Common data validation checks include:
Data Type Check
This verification ensures that the input data is of the correct type. For instance, a specific field
may only accept numerical values. If this is the case, the system should reject any data that
includes letters or special characters. Ensuring the correct data type prevents errors and
maintains data integrity.
Code Check
Code checking verifies that a field contains a valid value or adheres to specific formatting
rules. For example, validating a postal code can be done by cross-referencing it with a list of
authorized codes. This method can also be applied to other entities, such as country codes and
industry classification codes like NAICS.
Range Check
Range checks ensure that a value falls within a specified range. This is particularly important
for numerical data, where values outside the expected range can indicate errors. For example,
an age field might be restricted to values between 0 and 120. If an entered value falls outside
this range, it indicates an error that needs to be corrected.
Format Check
Some data types must follow a specific format. A common example is date fields, which may
need to be formatted as "YYYY-MM-DD" or "DD-MM-YYYY." Implementing format
checks ensures that data remains consistent and reliable over time. This is especially
important for data analysis and reporting, where uniform data formatting is crucial.
Consistency Check
Consistency checks are logical verifications that ensure data coherency. For example,
verifying that the delivery date of a package is after the shipment date ensures logical
consistency. Such checks are vital for maintaining the integrity of data relationships and
dependencies within a database.
Uniqueness Check
Certain fields, such as identification numbers or email addresses, must be unique. Uniqueness
checks ensure that these fields do not contain duplicate entries, maintaining the
distinctiveness of each record in the database. This is crucial for preventing data redundancy
and ensuring the reliability of the database.
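Several of these checks can be expressed directly in a table definition. The sketch below is an illustration only, using a hypothetical demonstration table rather than one of the Quiet Attic Films tables, and assuming an engine that enforces CHECK constraints (e.g. MySQL 8.0.16+ or PostgreSQL): the column data types implement the data type and format checks, the CHECK clauses implement the range and consistency checks, and UNIQUE implements the uniqueness check.

-- Hypothetical table used only to illustrate validation checks.
CREATE TABLE ValidationDemo (
    Record_ID     INT PRIMARY KEY,                                -- data type check: numeric identifier only
    Email         VARCHAR(100) UNIQUE,                            -- uniqueness check: no duplicate email addresses
    Age           INT CHECK (Age BETWEEN 0 AND 120),              -- range check: plausible ages only
    Ship_Date     DATE,                                           -- format check: must be a valid date
    Delivery_Date DATE,
    CHECK (Delivery_Date IS NULL OR Delivery_Date >= Ship_Date)   -- consistency check: delivery after shipment
);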
1.2.8. Implementing Data Validation
Implementing data validation requires a multi-faceted approach. First, organizations need to
define clear data validation rules based on their specific needs and industry standards. This
involves identifying critical data fields and establishing the criteria for validating these fields.
Next, automated tools and software can be employed to enforce these validation rules during
data entry and processing. Regular audits and reviews should also be conducted to ensure the
ongoing accuracy and quality of the data. Additionally, providing training and guidelines to
staff involved in data entry and management can help minimize errors and reinforce the
importance of data validation.

Figure 2: Data Validation tools


1.3. Evaluation of the design in relation to user and system requirements

User requirement: To accommodate various productions, clients, locations, properties, and staff, the database should be designed accordingly.
ERD entities: Production, Client, Location, Staff, Property
Wireframe: //ToDO – Need to attach
Evaluation: The database design meets the user's needs by creating entities for production, client, location, staff, and property. This structure ensures that the system can handle various aspects of the business, providing a comprehensive solution for managing diverse data related to productions and associated elements.

User requirement: The system must be capable of tracking the status of productions, such as identifying whether they are in pre-production, production, or post-production stages.
ERD entities: Production
Evaluation: The system effectively tracks the production status, distinguishing between pre-production, production, and post-production stages. This functionality is critical for monitoring and managing the workflow of productions, ensuring transparency and efficiency throughout the production lifecycle.

User requirement: The database should offer search functionality that allows users to effortlessly look up productions, clients, locations, properties, and staff types using varying criteria such as name, location, date, or client.
ERD entities: Production, Client, Location, Staff, Property
Evaluation: The search functionality is robust, enabling users to find productions, clients, locations, properties, and staff using various criteria like name, location, date, or client. This feature greatly enhances user experience by making it easy to access and retrieve specific information quickly and efficiently.

User requirement: The system must have user-friendly functionalities that enable hassle-free creation, modification, and removal of productions, clients, locations, properties, and staff types.
ERD entities: Production, Client, Location, Staff, Property, Clients_ContactNo, Location_ContactNo, Staff_ContactNo, Production_Staff, Production_Location, Location_Property
Evaluation: User-friendly functionalities are in place for creating, modifying, and deleting records related to productions, clients, locations, properties, and staff. These features streamline data management processes, allowing users to easily update and maintain the database without unnecessary complications.

User requirement: The system must implement suitable security measures to safeguard confidential data stored in the database.
ERD entities: UserAccount
Evaluation: Security measures are implemented through distinct user and admin account classifications, with specific permissions assigned to each. This approach ensures that sensitive data is protected and only accessible by authorized personnel, thereby maintaining data confidentiality and integrity.

User requirement: The database must monitor the permissions and access levels of various users, ensuring that confidential information can only be accessed by authorized individuals.
ERD entities: UserAccount
Evaluation: The system rigorously monitors user permissions and access levels, ensuring that confidential information is restricted to authorized users only. This level of control is essential for maintaining security and preventing unauthorized access to sensitive data.

User requirement: The system must enable scheduling of staff and equipment for each production.
ERD entities: Production_Staff
Evaluation: The creation of the Production_Staff entity facilitates the scheduling of staff and equipment for productions. This scheduling capability is crucial for effective resource management, ensuring that the necessary personnel and equipment are allocated appropriately to meet production needs.

User requirement: The database must possess a backup and recovery mechanism to prevent the loss of data in the occurrence of system failures or other disasters.
ERD entities: Backup/Recovery
Evaluation: A robust backup and recovery mechanism is implemented in the C# code, ensuring data protection in the event of system failures or disasters. This feature is vital for maintaining data integrity and availability, allowing for swift recovery and minimal disruption in case of data loss incidents.
Activity 02

LO2 Develop a fully-functional relational database system, based on an existing system design.

2.1. Data Definition Language (DDL) in SQL

Data Definition Language (DDL) consists of SQL statements that are used to define and
modify the structure of database objects within a database schema. DDL is essential for
creating, altering, and deleting database structures such as tables, indexes, and other database
objects. Unlike Data Manipulation Language (DML), DDL does not deal with the data itself,
but rather with the schema that organizes and manages the data. Typically, DDL commands
are used by database administrators rather than regular users, who interact with the database
through applications.

2.1.1. Key DDL Commands

CREATE

The CREATE command is used to create a new database or any of its objects, such as tables,
indexes, views, stored procedures, and triggers. This command establishes the structure of the
database and its components. For example, creating a new table in a database involves
specifying the table name and its columns along with their data types and constraints.

Example: To create a table named Employees, you would specify the columns such as
EmployeeID, FirstName, LastName, DateOfBirth, Position, and Salary, including their
respective data types and constraints. This establishes the structure for storing employee
records.
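As a concrete sketch, the statement below creates the Employees table described above; the column data types and lengths are assumptions, since the text specifies only the column names.

CREATE TABLE Employees (
    EmployeeID  INT PRIMARY KEY,       -- unique identifier for each employee
    FirstName   VARCHAR(50) NOT NULL,
    LastName    VARCHAR(50) NOT NULL,
    DateOfBirth DATE,
    Position    VARCHAR(50),
    Salary      DECIMAL(10, 2)         -- assumed precision for salary values
);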

DROP

The DROP command is used to delete an existing database object, such as a table, index, or
view. This command removes the object and all the data contained within it from the
database. It is a powerful command that should be used with caution because the deletion is
permanent and cannot be undone.

Example: To delete the Employees table, you would use the DROP TABLE Employees
command. This would permanently remove the table and all its data from the database.

ALTER

The ALTER command modifies an existing database object. It can be used to add, delete, or
modify columns in a table, or to change the properties of database objects. This command is
essential for making structural changes to a database after it has been created.

Example: To add a new column named Email to the Employees table, you would use the
ALTER TABLE Employees ADD Email VARCHAR(100) command. This adds a new
column for storing email addresses.
TRUNCATE

The TRUNCATE command removes all rows from a table without deleting the table itself. It
also reclaims the storage space occupied by the table’s data. This command is faster than
DELETE because it does not generate individual row delete actions and does not fire triggers.

Example: To remove all data from the Employees table, you would use the TRUNCATE
TABLE Employees command. This clears all records while retaining the table structure.

COMMENT

The COMMENT command is used to add descriptive comments to the data dictionary. These
comments help to document the database schema, providing useful information about the
purpose and structure of various database objects.

Example: To add a comment to the Employees table, you would use the COMMENT ON
TABLE Employees IS 'Table containing employee records' command. Similarly, you can add
comments to specific columns for clarity.

RENAME

The RENAME command changes the name of an existing database object. This is useful
when there is a need to update the name of a table, column, or other database object to better
reflect its purpose or to adhere to new naming conventions.

Example: To rename the Employees table to Staff, you would use the ALTER TABLE
Employees RENAME TO Staff command. This updates the table name while retaining its
structure and data.
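Collected as statements (in an order that can be run in sequence), the examples above look roughly as follows. Note that COMMENT ON is dialect-specific: PostgreSQL and Oracle support it, while MySQL instead uses a COMMENT clause on CREATE/ALTER TABLE.

ALTER TABLE Employees ADD Email VARCHAR(100);       -- add a column for email addresses
TRUNCATE TABLE Employees;                           -- remove every row but keep the table structure
COMMENT ON TABLE Employees IS 'Table containing employee records';   -- PostgreSQL/Oracle syntax
ALTER TABLE Employees RENAME TO Staff;              -- rename the table
DROP TABLE Staff;                                   -- permanently delete the (renamed) table and its data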

2.1.2. Creating a Database and Table

To illustrate the use of DDL commands, let’s create a new database and a table within it.

Creating a Database

To create a new database named CompanyDB, you would use the CREATE DATABASE
CompanyDB command. This initializes a new database environment for storing and
managing data.

Using the Database

To switch to the newly created database, you would use the USE CompanyDB command.
This sets the context for subsequent operations within the specified database.

Creating a Table

To create a table named Departments within the CompanyDB database, you would define
columns such as DepartmentID, DepartmentName, ManagerID, and Location. This involves
specifying data types and constraints to establish the table structure.
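As a sketch, and assuming MySQL-style syntax for USE, the steps above would look like this (the column data types are assumptions, since the text lists only the column names):

CREATE DATABASE CompanyDB;                 -- create the new database
USE CompanyDB;                             -- switch the session to CompanyDB

CREATE TABLE Departments (
    DepartmentID   INT PRIMARY KEY,        -- unique identifier for each department
    DepartmentName VARCHAR(100) NOT NULL,
    ManagerID      INT,
    Location       VARCHAR(100)
);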
2.1.3. Modifying the Table

Suppose we need to modify the Departments table to add a new column for the budget. You
would use the ALTER TABLE Departments ADD Budget DECIMAL(15, 2) command to
add the new Budget column, which will store the department’s annual budget.

Deleting the Table

If the Departments table is no longer needed, you can delete it using the DROP TABLE
Departments command. This command removes the table and all its data from the database
permanently.

Using Truncate for Data Removal

To remove all data from the Departments table without deleting the table itself, you would
use the TRUNCATE TABLE Departments command. This clears the table’s data while
retaining its structure.

Adding Comments

To add comments to the Departments table and its columns, you can use the COMMENT
command. For instance, you might use COMMENT ON TABLE Departments IS 'Table
containing department records' and COMMENT ON COLUMN Departments.Budget IS
'Annual budget of the department' to provide descriptive information.

Renaming the Table

If you need to rename the Departments table to Divisions, you would use the ALTER
TABLE Departments RENAME TO Divisions command. This updates the table name to
better reflect its content or purpose.
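Taken together, the modifications described in this section would be issued roughly as below (again, COMMENT ON assumes a PostgreSQL/Oracle-style dialect):

ALTER TABLE Departments ADD Budget DECIMAL(15, 2);          -- new column for the annual budget
TRUNCATE TABLE Departments;                                  -- clear all rows while keeping the structure
COMMENT ON TABLE Departments IS 'Table containing department records';
COMMENT ON COLUMN Departments.Budget IS 'Annual budget of the department';
ALTER TABLE Departments RENAME TO Divisions;                 -- rename to better reflect its purpose
-- DROP TABLE Divisions;                                      -- run only if the table is no longer needed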

DDL commands are fundamental for managing the structure of a database. They allow for the
creation, alteration, and deletion of database objects, thereby shaping the schema that
organizes and stores data. By using commands like CREATE, DROP, ALTER, TRUNCATE,
COMMENT, and RENAME, database administrators can effectively control and document
the database structure, ensuring it meets the requirements of the system and its users. This
ensures the integrity, efficiency, and clarity of the database schema, which is crucial for
robust data management and utilization.

Figure 3: DDL and DML Commands


//TODO
Need to attach screenshots of dml commands used in quiet attic films project
2.1.4. Data Manipulation Language (DML) in SQL

Data Manipulation Language (DML) is a subset of SQL statements primarily concerned with
the manipulation of data stored within a database. DML statements allow users to insert,
update, delete, and retrieve data from database tables, playing a crucial role in managing and
interacting with the database’s data. Unlike Data Definition Language (DDL), which deals
with the structure of the database, DML focuses on the actual data.

Key DML Commands

INSERT

The INSERT statement is used to add new rows of data to a specified table. This command
allows for the insertion of single or multiple rows in one statement. It specifies the table into
which data will be inserted and lists the values to be inserted into the corresponding columns.

Example: To add a new employee to the Employees table, you would use the INSERT INTO
Employees (EmployeeID, FirstName, LastName, DateOfBirth, Position, Salary) VALUES
(1, 'John', 'Doe', '1980-01-01', 'Manager', 75000) command. This adds a new row with the
specified data to the Employees table.

UPDATE

The UPDATE statement modifies existing data within a table. This command allows for the
updating of one or more columns in one or more rows, based on specified conditions. The
SET clause specifies the columns to be updated and their new values, while the WHERE
clause identifies the rows to be updated.

Example: To update the salary of an employee with EmployeeID 1, you would use the
UPDATE Employees SET Salary = 80000 WHERE EmployeeID = 1 command. This
changes the salary for the specified employee.

DELETE

The DELETE statement removes rows from a table based on specified conditions. This
command is useful for deleting data that is no longer needed or for purging outdated records
from the database. The WHERE clause specifies which rows should be deleted.

Example: To remove an employee with EmployeeID 1 from the Employees table, you would
use the DELETE FROM Employees WHERE EmployeeID = 1 command. This deletes the
specified row from the table.

SELECT

The SELECT statement retrieves data from one or more tables. It allows users to specify
which columns to retrieve and which rows to return, based on conditions specified in the
WHERE clause. The SELECT statement is fundamental for querying the database and
obtaining information.
Example: To retrieve all employee records from the Employees table, you would use the
SELECT * FROM Employees command. This returns all columns for all rows in the
Employees table.
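Written out in full, using the illustrative values from the text and reordered so the sequence can be run as-is, the four statements are:

INSERT INTO Employees (EmployeeID, FirstName, LastName, DateOfBirth, Position, Salary)
VALUES (1, 'John', 'Doe', '1980-01-01', 'Manager', 75000);    -- add a new employee row

UPDATE Employees SET Salary = 80000 WHERE EmployeeID = 1;     -- modify that employee's salary

SELECT * FROM Employees;                                      -- retrieve all employee records

DELETE FROM Employees WHERE EmployeeID = 1;                   -- finally remove the row again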

Importance of DML

DML commands are essential for database operations, enabling users to manage the data
effectively. They allow for the dynamic handling of data, ensuring that it can be inserted,
updated, deleted, and queried as needed. This flexibility is vital for maintaining the accuracy,
relevance, and accessibility of the data within a database.

//TODO
Need to attach screenshots of dml codes used in quiet attic films project
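Pending those screenshots, the statements below sketch the kind of DML the Quiet Attic Films database could use. They assume the table and column names from the relational schema in section 1.1.4; every value shown is hypothetical.

-- Register a client and one of their productions (hypothetical data).
INSERT INTO Client (Client_ID, NIC, First_Name, Last_Name, Birth_Day, No, Line1, Line2, City, Email, State)
VALUES (1, '851234567V', 'Amal', 'Perera', '1985-03-12', '25', 'Galle Road', NULL, 'Colombo', 'amal@example.com', 'Active');

INSERT INTO Production (Prod_ID, Prod_Name, Prod_Type, Start_Date, End_Date, Client_ID, Payment_ID, Note, Prod_State, State)
VALUES (9, 'Colombo Travel Documentary', 'Documentary', '2023-05-01', '2023-05-04', 1, NULL, NULL, 'Pre-Production', 'Active');

-- Move the production into its next stage.
UPDATE Production SET Prod_State = 'Production' WHERE Prod_ID = 9;

-- List every production together with its client.
SELECT p.Prod_ID, p.Prod_Name, p.Prod_State, c.First_Name, c.Last_Name
FROM Production p
JOIN Client c ON c.Client_ID = p.Client_ID
ORDER BY p.Start_Date;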
2.1.5. Database and table creation for Quiet Attic Films

SQL query for Quiet Attic Films database and table creation

//ToDo

Need to attach SQL codes for Database creation Screenshots here
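Pending those screenshots, the following is a minimal DDL sketch for the database and three of its core tables, using the table and column names from the relational schema in section 1.1.4. The database name, data types and lengths are assumptions, MySQL-style syntax is assumed, and the remaining tables would be created in the same way.

CREATE DATABASE QuietAtticFilms;            -- hypothetical database name
USE QuietAtticFilms;

CREATE TABLE Client (
    Client_ID  INT PRIMARY KEY,
    NIC        VARCHAR(12) UNIQUE,
    First_Name VARCHAR(50) NOT NULL,
    Last_Name  VARCHAR(50) NOT NULL,
    Birth_Day  DATE,
    No         VARCHAR(10),
    Line1      VARCHAR(100),
    Line2      VARCHAR(100),
    City       VARCHAR(50),
    Email      VARCHAR(100),
    State      VARCHAR(20)
);

CREATE TABLE Payment (
    Payment_ID INT PRIMARY KEY,
    Amount     DECIMAL(12, 2),
    Date       DATE,
    State      VARCHAR(20)
);

CREATE TABLE Production (
    Prod_ID    INT PRIMARY KEY,
    Prod_Name  VARCHAR(100) NOT NULL,
    Prod_Type  VARCHAR(50),
    Start_Date DATE,
    End_Date   DATE,
    Client_ID  INT NOT NULL,
    Payment_ID INT,
    Note       TEXT,
    Prod_State VARCHAR(30),                 -- pre-production / production / post-production
    State      VARCHAR(20),
    FOREIGN KEY (Client_ID)  REFERENCES Client (Client_ID),    -- referential integrity to Client
    FOREIGN KEY (Payment_ID) REFERENCES Payment (Payment_ID)   -- referential integrity to Payment
);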
