Project - Report
A PROJECT REPORT
Submitted by
BACHELOR OF ENGINEERING
In
JUNE 2025
ANNA UNIVERSITY : CHENNAI 600 025
BONAFIDE CERTIFICATE
SIGNATURE SIGNATURE
ACKNOWLEDGEMENT
We take this opportunity to express our gratitude to our Founder and Chairman
[Link] and our Correspondent [Link] for allowing us to take up this project.
We thank our Principal [Link] Jamuna Anand, [Link], Ph.D., who has always been
an inspiration for us to carry out our responsibilities and who provided a
comfortable environment for this project work.
We express our deep sense of gratitude to our esteemed Supervisor, Mr. D Raj
Thilak, M.E., for his constant guidance and cooperation during the project work.
Further, we thank our parents, family members, and friends for their moral support.
BRYAN ALAN A
JEIN JASPHER J
LINO FEDRICK J
ABSTRACT
TABLE OF CONTENTS
C. NO TITLE PAGE NO
ABSTRACT i
LIST OF FIGURES iv
LIST OF TABLES v
LIST OF ABBREVIATIONS vi
1 INTRODUCTION 1
1.1. Overview 1
1.2. Problem Statement 1
1.3. Cloaking 3
1.4. Aim And Objective 5
1.5. Scope Of The Project 6
2 LITERATURE SURVEY 7
2.1 General 7
2.2 Related Work 7
3 SYSTEM ANALYSIS 13
3.1. Existing System 13
3.2. Proposed System 14
4 SYSTEM IMPLEMENTATION 17
4.1. Modules Description 17
4.1.1. Cloud Service Provider Web App 17
4.1.2. End User Interface 17
4.1.3. Cloaking Wall Model 19
4.1.4. Bot Identification And Data Distribution 20
4.1.5. Disguise Data Generator 21
5 SYSTEM REQUIREMENTS 23
5.1. Hardware Requirements 23
5.2. Software Requirements 23
6 SYSTEM DESIGN 24
6.1. System Architecture 24
6.2. Dataflow Diagram 25
6.3. UML Diagram 26
6.3.1. Use Case Diagram 26
6.3.2. Activity Diagram 27
6.3.3. Sequence Diagram 28
6.4. Table Design 29
7 SYSTEM TESTING 35
7.1. Software Testing 35
7.2. Test Cases 37
7.3. Test Report 39
8 CONCLUSION AND FUTURE ENHANCEMENT 43
8.1. Conclusion 43
8.2. Future Enhancement 43
APPENDIX 45
I. Source Code 45
II. Snapshots 57
BIBLIOGRAPHY 65
Web References 66
LIST OF FIGURES
LIST OF TABLES
LIST OF ABBREVIATIONS
CHAPTER 1
INTRODUCTION
1.1. OVERVIEW
our constant attention to considerations around information security; all channels,
all devices, all the time. Cloud security considerations span a range of concerns:
resource connectivity, user entitlements, data loss prevention,
transitory/stationary data handling and encryption policies, data security
classification restrictions, cross-border information flow...the list goes on. It is
not uncommon for information security considerations to run counter to cloud
solution patterns. Without clear information security guidelines articulated in the
Cloud Strategy, it is unlikely that the organization will be well protected from
security risks in the cloud.
Protecting customer data is crucial for service providers to build trust.
Traditional network security and cloud security differ in certain aspects: access
in traditional network security is controlled using a perimeter model, whereas
the large scale and flexibility of the cloud environment mean that implementing
security solutions there faces numerous challenges. Since data in the cloud is
frequently accessed from outside corporate networks, maintaining a record of
data access is difficult. In essence, the problem revolves around the need for an
advanced and comprehensive approach to cloud data storage security. The
proposed Cloaking Wall Model aims to provide a holistic solution by addressing
these concerns, offering persistent confidentiality, global consistency, timed
access controls, and location-sensitive protection.
1.3. CLOAKING
CAMOUFLAGE DATA DISGUISE
ChaCha20
Fig 1.4. ChaCha20
1.4. AIM AND OBJECTIVE
Aim
The aim of the project is to establish an advanced cloud data security
framework by designing a Cloaking Wall Model integrated with camouflage
techniques. This framework seeks to enhance privacy and access control for
sensitive data stored in cloud environments.
Objectives
To Develop a Robust Cloaking Wall Model: Create a secure foundation for
advanced cloud data security.
To Integrate Camouflage Techniques: Implement Camouflage Data
Disguise for enhanced privacy.
To Ensure Persistent Confidentiality: Fortify data security to prevent
unauthorized access.
To Achieve Global Consistency and Access Control: Establish
mechanisms for timed access control and global consistency.
To Implement Location-Sensitive Data Protection: Introduce measures for
enhanced data protection based on user geography.
To Develop Bot Identification and Targeted Content Distribution: Create a
mechanism for bot identification and selective content distribution.
To Address Cross-Organizational Data Sharing Challenges: Provide a
robust solution for secure cross-organizational data access.
To Reduce Workloads of Certificate Management: Implement measures to
streamline certificate management.
To Simplify Key Management: Develop strategies for key management
simplification while ensuring security.
CHAPTER 2
LITERATURE SURVEY
2.1 GENERAL
Shota Fujii, Takayuki Sato, and Sho Aoki [1] address the growing
challenge of detecting cloaking techniques used by malicious hosts to evade
security mechanisms. Focusing on geofencing and time-based cloaking, the
authors developed an active monitoring system called Stargazer, which enables
long-term, multiregional surveillance of malicious domains. By observing 18,397
malicious hosts over a two-year period, the study uncovers how attackers
strategically hide their activities based on geographic location and time,
effectively bypassing traditional, single-point monitoring methods. Stargazer’s
broad data collection approach enhances the visibility of evasive behaviors that
are otherwise missed in short-term or localized analyses. While the paper makes
a significant contribution by revealing the persistence and adaptability of
cloaking strategies, it lacks detailed disclosure of the specific algorithms or
detection techniques employed, which may limit reproducibility and further
advancement.
Caixia Zhang and Zijian Pan [2] explore the pressing issues of data
security and user privacy in cloud-based systems, proposing an innovative
solution that integrates a logit link function with a longitudinal joint learning
framework for the gamma regression model. To further enhance data integrity
and traceability, the study leverages semantic web and blockchain technologies,
establishing a distributed, credit-guaranteed traceability mechanism for product
quality and safety throughout the supply chain. A concept verification system is
designed to ensure data accuracy and trust at each stage, promoting
interoperability across diverse systems via standardized ontologies and smart
contracts. While the proposed framework presents a promising direction for
secure, privacy-preserving data management in cloud environments, the study
lacks clarity on real-world implementation results, scalability, and dataset
specifics, which may limit its practical applicability and generalizability across
broader domains.
increase in data protection concerns, the authors analyze a wide range of
techniques—spanning cryptography, access control, watermarking, machine
learning, differential privacy, and probabilistic models—to evaluate their
functionality, scope, and effectiveness. The paper provides an in-depth
comparison of these approaches, identifies existing research gaps, and offers
guidance for future research directions. It underscores the importance of
integrating multiple security techniques to achieve robust and scalable data
protection. However, the study does not delve into specific algorithmic
implementations or datasets, which may hinder the assessment of practical
feasibility and limit its use as a technical blueprint for deployment in real-world
cloud environments.
Sajid Habib Gill, Mirza Abdur Razzaq, and Muneer Ahmad [5]
explore the critical security and privacy issues in cloud computing, with a
focused lens on smart campus environments. Through a detailed case study, the
authors examine threats such as data breaches, access control vulnerabilities,
cyber-attacks, and data availability concerns, emphasizing the urgent need for
robust security measures in educational and institutional cloud-based systems.
The study highlights the potential of blockchain technology as a transformative
solution for enhancing data integrity, reliability, and privacy in the cloud. By
anchoring the research in a real-world scenario, the paper provides practical
insights into the challenges and possible mitigations specific to smart campuses.
However, the lack of in-depth technical details, such as the specific
implementation of blockchain or any underlying datasets, limits the replicability
and technical validation of the findings, and may reduce its effectiveness as a
guide for deployment in similar environments.
(MPCC) scheme. By leveraging compressive sensing (CS), the proposed MPCC
framework allows for efficient data compression, encryption, and recovery of
sparse signals, thereby reducing the computational burden on resource-
constrained IoT devices. The scheme supports multi-level privacy, with three
distinct variants tailored for different applications, including smart meter
statistics, electrocardiogram (ECG) signal anonymization, and image protection.
Unlike traditional encryption methods that are computationally intensive and ill-
suited for IoT sensors, MPCC enables secure and efficient processing while
minimizing transmission and storage costs. Theoretical security analyses
demonstrate the robustness of the scheme against ciphertext-only attacks.
manual and inefficient due to fragmented and non-machine-readable regulations,
the authors design a system that captures data threats, security controls, and
compliance requirements across multiple jurisdictions. This integrated
knowledge graph enables automated reasoning and supports cloud service
providers in aligning their policies with global regulations. While the approach
offers a significant step toward automated cloud compliance, the paper lacks
detailed algorithmic descriptions and provides minimal insight into the dataset
used, limiting its technical reproducibility and potential expansion beyond
security-focused compliance into broader IT governance models.
Hongbo Li and Qiong Huang [10] address the critical issue of data
privacy in cloud storage by proposing a novel cryptographic solution called
Identity-Based Encryption with Equality Test supporting Flexible Authorization
(IBEET-FA). This scheme allows authorized users to test whether two ciphertexts
encrypted under different keys contain the same underlying message, enabling
efficient searching and comparison of encrypted data without exposing the
plaintext. Built upon bilinear pairing, the methodology tackles the complexity of
performing equality tests in an identity-based encryption environment while
providing fine-grained authorization control over who can conduct these tests.
IBEET-FA helps to accelerate secure data sharing among groups and reduces the
key management challenges typical of traditional public key infrastructure.
However, the reliance on bilinear pairing introduces computational overhead, and
the authors acknowledge the need for future work to develop schemes that avoid
this expensive operation, pointing to a limitation in scalability and efficiency of
the current approach.
CHAPTER 3
SYSTEM ANALYSIS
Disadvantages
Time-Based Cloaking
Time-based cloaking introduces temporal restrictions on data access,
allowing organizations to define specific time windows during which data can be
accessed. This method enhances security by limiting access to predefined
timeframes, reducing the exposure of data to potential threats. Time-based
cloaking adds an additional layer of control to access patterns, contributing to a
more secure cloud data storage environment.
Geolocation-Based Cloaking
Geolocation-Based Cloaking tailors data protection based on a user's
physical location, adding a location-sensitive security layer. This approach
ensures access to sensitive data is granted only from approved geographic areas,
making it ideal for globally dispersed organizations. Together, these four
methods form the Cloaking Wall Model, enhancing confidentiality, consistency,
timed access, and location-aware protection.
Camouflage Data Disguise
The Camouflage Data Disguise technique represents an advanced
cryptographic approach that seamlessly integrates Chaffing and Winnowing with
the formidable ChaCha20 encryption algorithm. This technique is strategically
designed to provide disguised data as a countermeasure against unauthorized
access, targeting both unpermitted users and potentially malicious bots.
Advantages
CHAPTER 4
SYSTEM IMPLEMENTATION
Admin or Data Owner Interface
Login Module
The login module provides a secure authentication process for Admins or Data
Owners, ensuring only authorized access to the cloud management interface.
Add and Manage Data
Admins can add, organize, and manage data within the cloud storage. This
module allows them to upload, categorize, and control access to various datasets.
It also supports version control and data updates, ensuring consistency and
traceability across modifications.
Add and Manage Users
Admins have the capability to add new users to the system, defining their
roles and permissions. This module also allows them to modify or revoke access
as needed. It includes activity monitoring features to track user behavior and
ensure compliance with security policies.
Provide Login Credentials to Users
Admins can generate and distribute login credentials for users added to the
system. This ensures a secure onboarding process for new users.
Set Access Policy using Cloaking Wall Model
Leveraging the Cloaking Wall Model, this module enables Admins to set
access policies. Admins can define Long-Term Cloaking, Multi-Region based
Cloaking, Time-based Cloaking, and Geolocation-based Cloaking to enhance
data security.
Monitoring Data Access
Admins can monitor and audit data access patterns using this module. It
provides insights into who accessed specific data, when, and from which location,
contributing to overall security and compliance. This feature helps detect
unauthorized access, enforce access policies, and maintain a detailed audit trail
for accountability.
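The monitoring flow described above can be sketched as a minimal in-memory audit trail. The names `record_access` and `accesses_by_user` and the field layout are illustrative assumptions, not taken from this report; a real deployment would persist events in the MySQL tables described in Chapter 6.

```python
import datetime

# In-memory audit trail; all names here are hypothetical stand-ins.
audit_log = []

def record_access(user_id, file_id, location, allowed):
    """Append one access event so admins can later audit who accessed what."""
    audit_log.append({
        "user_id": user_id,
        "file_id": file_id,
        "location": location,
        "allowed": allowed,
        "timestamp": datetime.datetime.now().isoformat(),
    })

def accesses_by_user(user_id):
    """Filter the trail for a single user, as a monitoring screen would."""
    return [e for e in audit_log if e["user_id"] == user_id]

record_access("U1", 7, "Chennai", True)
record_access("U2", 7, "Unknown", False)
print(len(accesses_by_user("U1")))
```

Flagged entries (allowed = False) would feed the unauthorized-access log table described later in the table design.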
Data User Interface
Login Module
Similar to the Admin interface, the login module provides secure
authentication for Data Users, ensuring that only authorized individuals can
access the cloud resources.
Access Data
Data Users can use this module to access the data allocated to them. The
interface provides a user-friendly environment for retrieving, modifying, or
analysing data based on their permissions.
Monitoring Data Access
Data Users have limited access to monitoring tools to track their own data
access. This module allows them to review their activity and ensures transparency
in usage.
Time-Based Cloaking Module
Time-Based Cloaking empowers Admins to set temporal restrictions on
data access. This module enhances security by allowing the definition of specific
time windows during which data can be accessed, adding an extra layer of control
over temporal access patterns. It also helps mitigate risks from unauthorized
access during off-hours or non-business periods.
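A minimal sketch of such a temporal check follows; the function name, hour-level granularity, and wrap-around handling are illustrative assumptions, not the report's actual implementation.

```python
import datetime

def within_time_window(now, start_hour, end_hour):
    """Return True when the current hour falls inside the admin-defined window.
    Handles windows that wrap past midnight (e.g. 22:00 to 06:00)."""
    h = now.hour
    if start_hour <= end_hour:
        return start_hour <= h < end_hour
    return h >= start_hour or h < end_hour

# A 09:00-17:00 business window: access at 10:30 passes, 20:00 is refused.
t = datetime.datetime(2025, 6, 1, 10, 30)
print(within_time_window(t, 9, 17))   # True
t2 = datetime.datetime(2025, 6, 1, 20, 0)
print(within_time_window(t2, 9, 17))  # False
```

Requests rejected here would also be recorded for the bot-identification checks described in the next module.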
Geolocation-Based Cloaking Module
The Geolocation-Based Cloaking module tailors data protection based on
user location. Admins can define security policies that vary depending on the
physical location of Data Users, adding a location-sensitive layer to access
controls.
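The location check could be sketched as below; the `GEO_POLICY` mapping and the region codes are hypothetical stand-ins for the Geolocation Wall Model table described in Chapter 6.

```python
# Hypothetical policy table: each (owner, user) pair maps to approved regions.
GEO_POLICY = {
    ("OWNER1", "U1"): {"IN", "SG"},  # U1 may access only from India or Singapore
}

def geo_allowed(owner_id, user_id, request_region):
    """Grant access only when the request originates from an approved region."""
    approved = GEO_POLICY.get((owner_id, user_id), set())
    return request_region in approved

print(geo_allowed("OWNER1", "U1", "IN"))  # True
print(geo_allowed("OWNER1", "U1", "US"))  # False
```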
Anomalous Access Timing
Bots often operate on predefined schedules or exhibit unnatural timing
patterns. The mechanism detects anomalous access timings that do not align with
the specified time-based access policies. This helps identify bots attempting to
access data outside permissible time windows.
Geolocation Inconsistencies
The mechanism evaluates the geolocation of incoming requests in
comparison to the Geolocation-Based Cloaking policies. If there are
inconsistencies, such as requests originating from unexpected or restricted
locations, it signals potential bot activity.
Rapid, Repetitive Access Attempts
Bots typically attempt to access data rapidly and repetitively, following
scripted sequences. The mechanism identifies patterns associated with rapid,
repetitive access attempts, flagging entities that display such behaviour for further
scrutiny.
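The rapid-access heuristic can be illustrated with a sliding-window counter. The window size and threshold here are illustrative assumptions, not values taken from the report.

```python
from collections import deque

WINDOW_SECONDS = 10  # illustrative window
MAX_REQUESTS = 5     # illustrative human-plausible ceiling

class RateTracker:
    """Flag clients whose request rate exceeds a human-plausible threshold."""
    def __init__(self):
        self.hits = {}

    def is_suspicious(self, client_id, ts):
        q = self.hits.setdefault(client_id, deque())
        q.append(ts)
        # Drop timestamps that have slid out of the window.
        while q and ts - q[0] > WINDOW_SECONDS:
            q.popleft()
        return len(q) > MAX_REQUESTS

tracker = RateTracker()
# Six requests within one second look scripted, not human.
flags = [tracker.is_suspicious("bot-1", 100.0 + i * 0.1) for i in range(6)]
print(flags[-1])  # True
```

In practice this signal would be combined with the timing and geolocation checks above before an entity is classified as a bot.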
The Malicious Data Generator Module is a security component designed to
simulate and generate data instances that intentionally breach established
policies within the Cloud Consumer Web App. It injects non-compliant data
targeted specifically at users who violate the access policy set by the admin.
Chaffing and Camouflage Process:
The module orchestrates a two-fold process: Chaffing, which adds decoy
or chaff data, and Camouflage, which further disguises non-compliant data
through additional obfuscation techniques. This combined approach ensures a
multi-layered defense against unauthorized access. It also helps mislead
malicious entities by obscuring the real data patterns, making it significantly
harder to distinguish valuable information from noise.
ChaCha20 Encryption:
Genuine, chaff, and camouflaged data undergo encryption using the
ChaCha20 algorithm. ChaCha20's strength in providing a secure and efficient
encryption process contributes significantly to safeguarding the confidentiality of
the disguised information.
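Assuming the widely used `cryptography` package (which the appendix source code already imports from), the ChaCha20 step can be sketched as below. Note that this library expects a 16-byte nonce, unlike the 8-byte IV used by the hand-rolled implementation in Appendix I.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

key = os.urandom(32)    # 256-bit ChaCha20 key
nonce = os.urandom(16)  # the cryptography library takes a 16-byte nonce here

def chacha20_encrypt(key, nonce, data):
    encryptor = Cipher(algorithms.ChaCha20(key, nonce), mode=None).encryptor()
    return encryptor.update(data)

def chacha20_decrypt(key, nonce, data):
    # ChaCha20 is a stream cipher: decryption applies the same keystream XOR.
    decryptor = Cipher(algorithms.ChaCha20(key, nonce), mode=None).decryptor()
    return decryptor.update(data)

ct = chacha20_encrypt(key, nonce, b"genuine record")
print(chacha20_decrypt(key, nonce, ct))  # b'genuine record'
```

Genuine, chaff, and camouflaged records would each pass through this step before transmission.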
Winnowing Process:
At the recipient's end, the winnowing process disentangles the genuine data
from the chaff and camouflage layers. The ChaCha20 decryption algorithm,
combined with the appropriate key, unveils the original, non-compliant data.
This ensures that only authorized users can reconstruct the intended
information, preserving data confidentiality. The process adds an extra layer of
obfuscation, making it extremely difficult for unauthorized entities to interpret
or misuse intercepted data.
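The chaffing-and-winnowing idea can be sketched with HMAC tags, following Rivest's original scheme: genuine packets ("wheat") carry valid MACs, decoys ("chaff") carry random tags, and the recipient winnows by verifying each tag. This is a simplified stand-in for the report's combined Camouflage and ChaCha20 pipeline, with all names illustrative.

```python
import hmac, hashlib, os

MAC_KEY = os.urandom(32)  # shared authentication key; illustrative setup

def mac(packet):
    return hmac.new(MAC_KEY, packet, hashlib.sha256).digest()

def chaff(genuine_packets, decoys):
    """Wheat carries a valid MAC; chaff carries random bytes in the MAC slot.
    A real sender would also shuffle the combined stream."""
    stream = [(p, mac(p)) for p in genuine_packets]
    stream += [(d, os.urandom(32)) for d in decoys]
    return stream

def winnow(stream):
    """The recipient keeps only packets whose MAC verifies under the shared key."""
    return [p for p, tag in stream if hmac.compare_digest(tag, mac(p))]

stream = chaff([b"real-1", b"real-2"], [b"fake-1", b"fake-2", b"fake-3"])
print(winnow(stream))  # [b'real-1', b'real-2']
```

An eavesdropper without MAC_KEY cannot tell wheat from chaff, which is the confidentiality property the Camouflage Data Disguise technique relies on.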
CHAPTER 5
SYSTEM REQUIREMENTS
CHAPTER 6
SYSTEM DESIGN
6.1. SYSTEM ARCHITECTURE
[Figure: system architecture diagram showing the Data Owner and Data User
connecting through the Login Model to Data Concealment, Location-Based
Cloaking, Time-Based Cloaking, Access Monitoring, and the Cloaking Area.]
6.2. DATA FLOW DIAGRAM
6.3. UML DIAGRAM
6.3.1. USE CASE DIAGRAM
6.3.2. ACTIVITY DIAGRAM
6.3.3. SEQUENCE DIAGRAM
6.4. TABLE DESIGN
Admin Login
No. | Field | Data Type | Field Size | Constraint | Description
1 | username | Varchar | 20 | Null | Admin Username
2 | password | Varchar | 20 | Null | Admin Password
The Data Owner Register module is designed to securely capture and
manage the registration details of data owners, ensuring proper identity
verification and access control. It includes key fields such as contact information,
authentication credentials, and approval status to facilitate secure interactions
within the cloud system.
The table structure is designed to store and manage basic information about
users. There are several string fields, with constraints on length, and references
(foreign key and primary key) to ensure data integrity. The dob field is stored as
a string, which might need conversion to a date type for better handling of dates.
Additionally, there is a small issue with the mobile field's declared type
("Bidint", presumably intended as Bigint), which should be checked and corrected.
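If the dob string follows the same "%d-%m-%Y" format used for reg_date elsewhere in the appendix source code (an assumption), the suggested conversion could look like this:

```python
import datetime

def parse_dob(dob_str):
    """Convert the stored dob string into a proper date object.
    The "%d-%m-%Y" format is assumed from the reg_date handling in Appendix I."""
    return datetime.datetime.strptime(dob_str, "%d-%m-%Y").date()

d = parse_dob("15-08-2001")
print(d.year)  # 2001
```

Storing the column as a native DATE type would let MySQL validate and sort these values directly.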
CC: File Uploaded Details
No. | Field | Data Type | Field Size | Constraint | Description
1 | File id | Int(11) | 11 | Primary Key | File Id
2 | Owner id | Varchar(20) | 20 | Foreign Key | Owner id
3 | File description | Varchar(20) | 20 | Null | File description
4 | File path | Varchar(20) | 20 | Null | File path
5 | File size | Varchar(20) | 20 | Null | File size
6 | Upload date | Timestamp | Null | Null | Upload date
The File Uploaded Details module records metadata for each file
uploaded by data owners, ensuring traceability and efficient file management. It
includes information such as file ID, owner ID, file path, and upload timestamp
to support secure storage and retrieval in the cloud environment.
CC: Geolocation Wall Model
No. | Field | Data Type | Field Size | Constraint | Description
1 | id | Int(11) | 11 | Null | Unique Id
2 | Owner id | Varchar(20) | 20 | Foreign Key | Owner id
3 | User id | Varchar(20) | 20 | Null | User id
4 | location | Varchar(30) | 30 | Null | File access
5 | Geo location path | Varchar(100) | 100 | Null | Geo location date
6 | File id | Int(11) | 11 | Null | File id
7 | Shared date | Timestamp | Null | Null | File share date
This model supports time-restricted file sharing and access logging, which
is particularly useful in systems where security and controlled data visibility are
priorities. It can be integrated into cloud data security models or time-sensitive
collaboration platforms.
CC: Region Wall Model
No. | Field | Data Type | Field Size | Constraint | Description
1 | id | Int(11) | 11 | Null | Unique Id
2 | Owner id | Varchar(20) | 20 | Foreign Key | Owner id
3 | User id | Varchar(20) | 20 | Null | User id
4 | Region | Varchar(30) | 30 | Null | Region Name
5 | Location | Varchar(100) | 100 | Null | Location
6 | File id | Int(11) | 11 | Null | File id
7 | Shared date | Timestamp | Null | Null | File share date
CC: Unauthorised Access Log
No. | Field | Data Type | Field Size | Constraint | Description
1 | id | Int(11) | 11 | Null | Unique Id
2 | User id | Varchar(20) | 20 | Foreign Key | User id
3 | Download mode | Varchar(20) | 20 | Null | Download mode id
4 | Download file id | Int(11) | 11 | Null | File downloaded id
5 | Date time | Timestamp | Null | Null | File shared date
CHAPTER 7
SYSTEM TESTING
7.1. SOFTWARE TESTING
System testing for the project encompasses various types of testing to ensure
the robustness, functionality, and security of the system. The following types
of testing are relevant:
Functional Testing
This type of testing verifies that each function of the system operates in
accordance with the requirements specified in the design documents. It includes
testing features like user authentication, data management, access control
policies, and monitoring capabilities.
Integration Testing
Integration testing ensures that individual components of the system work
together seamlessly as a whole. It validates interactions between different
modules, APIs, and external systems, including the integration of the Cloaking
Wall Model with the Cloud Consumer Web App.
Performance Testing
Performance testing assesses the responsiveness, scalability, and stability of
the system under various load conditions. It includes testing the system's ability
to handle concurrent users, process requests efficiently, and maintain acceptable
response times.
Security Testing
Security testing evaluates the system's resilience against potential security
threats and vulnerabilities. It includes testing for authentication mechanisms,
encryption protocols, access control measures, and data masking techniques.
Usability Testing
Usability testing focuses on assessing the system's user interface (UI) design,
navigation flow, and overall user experience. It involves gathering feedback from
end-users to identify any areas for improvement in terms of user interaction and
interface design.
Compatibility Testing
It verifies that the application is compatible with popular web browsers,
mobile devices, and screen resolutions, ensuring a consistent user experience
across diverse environments.
Regression Testing
Regression testing validates that recent code changes or enhancements do not
introduce new defects or regressions in existing functionality. It involves retesting
previously validated features and conducting automated regression test suites to
ensure that the system remains stable and reliable after updates.
Load Testing
Load testing evaluates the system's performance under expected and peak load
conditions. It involves simulating a high volume of concurrent users or data
requests to assess how the system handles stress, scalability, and resource
utilization.
Stress Testing
Stress testing assesses the system's behavior under extreme conditions beyond
its normal operational capacity. It involves pushing the system past its limits
to expose areas of weakness under heavy load, unexpected inputs, or adverse
environmental conditions.
Data Integrity Testing
Data integrity testing verifies the accuracy, consistency, and reliability of data
stored and processed by the system. It includes testing data validation rules,
data manipulation operations, and data integrity constraints to ensure that data
remains intact and error-free throughout its lifecycle.
7.2. TEST CASES
User Authentication
Test Case ID: UA_TC_001
Input: Valid username and password with correct multi-factor authentication.
Expected Result: Successful authentication, granting access to the system.
Actual Result: User successfully authenticated, and access granted.
Status: Pass
Dashboard Access
Test Case ID: DA_TC_001
Input: Successful authentication credentials.
Expected Result: Access to the dashboard module.
Actual Result: User gained access to the dashboard.
Status: Pass
Data Management by Admins
Test Case ID: DMA_TC_001
Input: Admin uploading and managing data.
Expected Result: Successful organization and control of data.
Actual Result: Admin successfully managed and organized data.
Status: Pass
Bot Identification Mechanism
Test Case ID: BIM_TC_001
Input: Monitoring user access patterns.
Expected Result: Accurate identification of potential bot activity.
Actual Result: Bot Identification Mechanism accurately identified potential
bot activity.
Status: Pass
7.3. TEST REPORT
Introduction
Test Objective
The primary objective of the testing phase was to evaluate the performance,
reliability, and functionality of the Advanced Cloud Data Security System.
Specific goals included validating user authentication, assessing data access
controls, and ensuring the successful integration of security features.
To verify the functionality of user authentication and authorization
mechanisms.
To validate the implementation of access control policies based on the
Cloaking Wall Model.
To assess the system's ability to manage data securely and enforce data access
policies effectively.
To evaluate the system's resilience against security threats and vulnerabilities.
To ensure that the system performs reliably under different scenarios and
workloads.
Test Scope
The testing scope covered various modules within the system, including user
authentication, dashboard access, Cloaking Wall Model integration, access policy
configuration, data management, monitoring, and security features such as bot
identification and disguise data generation.
Testing of user authentication, including login, registration, and multi-factor
authentication.
Testing of access control mechanisms, such as role-based access control and
policy enforcement.
Testing of data management functionalities, including data upload, storage,
retrieval, and deletion.
Testing of security features, such as encryption, data masking, and intrusion
detection.
Performance testing to assess system responsiveness, scalability, and resource
utilization.
Test Environment
The testing environment was set up with the following components:
Hardware
Processor: Intel Core i5-9400F CPU @ 2.90GHz
RAM: 8GB DDR4
Storage: 256GB SSD
Network Interface Card: Gigabit Ethernet
Software
Operating System: Windows 10 Home
Web Browser: Google Chrome, Mozilla Firefox
Database Management System: MySQL 8.0
Web Server: WampServer 3.2.0
Test Result
The following table outlines the results of each test case conducted during the
testing phase:
User Authentication: Successful authentication and access granted.
Access Control: Access policies enforced based on Cloaking Wall Model.
Data Management: Secure and efficient data management operations.
Security Testing: Resilience against security threats and vulnerabilities.
Performance Testing: Reliable performance under different scenarios.
Bug Report
A bug report is a document that details issues, defects, or unexpected
behavior encountered in software during testing or usage. It typically includes
information about the problem, steps to reproduce it, and any relevant system
configurations. Bug reports are essential for developers to identify and fix issues
in the software.
BID | TCID | Bug Description | Status | Output
Test Conclusion
CHAPTER 8
CONCLUSION AND FUTURE ENHANCEMENT
8.1. Conclusion
The future evolution of the system holds exciting possibilities, with key
areas of focus. Integrating machine learning algorithms stands out as a potential
enhancement, enabling dynamic analysis of access patterns to adeptly respond to
evolving security threats. Behavioural analytics is another avenue, offering a
nuanced understanding of user behaviour to distinguish normal activities from
potential risks. Additionally, exploring blockchain integration is on the horizon,
aiming to enhance data integrity and transparency by leveraging the decentralized
and tamper-resistant nature of blockchain technology. These enhancements
collectively propel the system towards a more adaptive, context-aware, and
secure future. Future enhancements may also include automated incident
response systems, where predefined workflows are triggered upon detection of
suspicious activity. Integration with threat intelligence platforms will allow the
Cloaking Wall Model to stay updated on the latest vulnerabilities and attack
vectors, ensuring proactive defense mechanisms.
APPENDIX I: SOURCE CODE
Packages
import os
import base64
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.fernet import Fernet
from Crypto import Random
from flask import Flask, render_template, Response, redirect, request, session, abort, url_for
import mysql.connector
import hashlib
import shutil
from datetime import date
import datetime
import math
from random import randint
from flask_mail import Mail, Message
from flask import send_file
Database Connection
mydb = mysql.connector.connect(
    host="localhost",
    user="root",
    password="",
    charset="utf8",
    database="cloud_cloaking"
)
Login
def login():
    msg = ""
    if request.method == 'POST':
        uname = request.form['uname']
        pwd = request.form['pass']
        cursor = mydb.cursor()
        cursor.execute('SELECT * FROM data_owner WHERE owner_id = %s AND password = %s AND status = 1', (uname, pwd))
        account = cursor.fetchone()
        if account:
            session['username'] = uname
            return redirect(url_for('upload'))
        else:
            msg = 'Incorrect username/password!'
Data Owner Registration
def register():
    msg = ""
    mycursor = mydb.cursor()
    mycursor.execute("SELECT max(id)+1 FROM data_owner")
    maxid = mycursor.fetchone()[0]
    now = datetime.datetime.now()
    rdate = now.strftime("%d-%m-%Y")
    if maxid is None:
        maxid = 1
    if request.method == 'POST':
        name = request.form['name']
        mobile = request.form['mobile']
        email = request.form['email']
        city = request.form['city']
        uname = request.form['uname']
        pass1 = request.form['pass']
        cursor = mydb.cursor()
        cursor.execute('SELECT count(*) FROM data_owner WHERE owner_id = %s', (uname,))
        cnt = cursor.fetchone()[0]
        if cnt == 0:
            sql = "INSERT INTO data_owner(id, name, mobile, email, city, owner_id, password, reg_date) VALUES (%s, %s, %s, %s, %s, %s, %s, %s)"
            val = (maxid, name, mobile, email, city, uname, pass1, rdate)
            mycursor.execute(sql, val)
            mydb.commit()
            print(mycursor.rowcount, "Registered Success")
            msg = "success"
        else:
            msg = 'fail'
Upload Files
def upload():
    # flash, secure_filename, app, imgext and img are defined elsewhere in the application.
    msg = ""
    act = ""
    if 'username' in session:
        uname = session['username']
        mycursor = mydb.cursor()
        mycursor.execute('SELECT * FROM data_owner WHERE owner_id=%s', (uname,))
        rr = mycursor.fetchone()
        name = rr[1]
        now = datetime.datetime.now()
        rdate = now.strftime("%d-%m-%Y")
        rtime = now.strftime("%H:%M")
        if request.method == 'POST':
            description = request.form['description']
            mycursor.execute("SELECT max(id)+1 FROM data_files")
            maxid = mycursor.fetchone()[0]
            if maxid is None:
                maxid = 1
            if 'file' not in request.files:
                flash('No file part')
                return redirect(request.url)
            file = request.files['file']
            file_type = file.content_type
            if file.filename == '':
                flash('No selected file')
                return redirect(request.url)
            if file:
                fname = "F" + str(maxid) + file.filename
                filename = secure_filename(fname)
                file.save(os.path.join(app.config['UPLOAD_FOLDER'], filename))
                bsize = os.path.getsize("static/upload/" + filename)
                fsize = bsize / 1024
                file_size = round(fsize, 2)
                ff = fname.split('.')
                i = 0
                file_ext = ''
                for fimg in imgext:
                    if fimg == ff[1]:
                        file_ext = img[i]
                        break
                    else:
                        file_ext = img[0]
                    i += 1
                sql = "INSERT INTO data_files(id, owner_id, description, file_name, file_type, file_size, reg_date, reg_time, file_extension) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)"
                val = (maxid, uname, description, filename, file_type, file_size, rdate, rtime, file_ext)
                mycursor.execute(sql, val)
                mydb.commit()
                msg = "success"
Share Files
def share():
    file_id = request.args.get("file_id")
    uname = ""
    msg = ""
    act = request.args.get('act')
    if 'username' in session:
        uname = session['username']
        mycursor = mysql.connection.cursor()
        mycursor.execute("SELECT * FROM data_owner WHERE owner_id = %s", (uname,))
        value = mycursor.fetchone()
        name = value[1]
        now = datetime.datetime.now()
        rdate = now.strftime("%d-%m-%Y")
        mycursor.execute("SELECT * FROM data_user WHERE owner_id = %s", (uname,))
        udata = mycursor.fetchall()
        mycursor.execute("SELECT count(*) FROM data_user WHERE owner_id = %s", (uname,))
        ucnt = mycursor.fetchone()[0]
        mycursor.execute("SELECT * FROM data_files WHERE id = %s", (file_id,))
        fdata = mycursor.fetchone()
        fname = fdata[3]
        if request.method == 'POST':
            selected_users = request.form.getlist('uu[]')
            for u1 in selected_users:
                mycursor.execute("SELECT max(id)+1 FROM data_share")
                maxid = mycursor.fetchone()[0]
                if maxid is None:
                    maxid = 1
                sql = "INSERT INTO data_share(id, owner_id, file_id, username, share_type, share_date) VALUES (%s, %s, %s, %s, %s, %s)"
                val = (maxid, uname, file_id, u1, '1', rdate)
                act = "success"
                mycursor.execute(sql, val)
                mysql.connection.commit()
ChaCha20 Encryption
import struct

def yield_chacha20_xor_stream(key, iv, position=0):
    """Generate the xor stream with the ChaCha20 cipher."""
    if not isinstance(position, int):
        raise TypeError
    if position & ~0xffffffff:
        raise ValueError('Position is not uint32.')
    if not isinstance(key, bytes):
        raise TypeError
    if not isinstance(iv, bytes):
        raise TypeError
    if len(key) != 32:
        raise ValueError
    if len(iv) != 8:
        raise ValueError

    def rotate(v, c):
        return ((v << c) & 0xffffffff) | v >> (32 - c)

    def quarter_round(x, a, b, c, d):
        x[a] = (x[a] + x[b]) & 0xffffffff
        x[d] = rotate(x[d] ^ x[a], 16)
        x[c] = (x[c] + x[d]) & 0xffffffff
        x[b] = rotate(x[b] ^ x[c], 12)
        x[a] = (x[a] + x[b]) & 0xffffffff
        x[d] = rotate(x[d] ^ x[a], 8)
        x[c] = (x[c] + x[d]) & 0xffffffff
        x[b] = rotate(x[b] ^ x[c], 7)

    ctx = [0] * 16
    ctx[:4] = (1634760805, 857760878, 2036477234, 1797285236)
    ctx[4:12] = struct.unpack('<8L', key)
    ctx[12] = ctx[13] = position
    ctx[14:16] = struct.unpack('<LL', iv)
    while 1:
        x = list(ctx)
        for i in range(10):
            quarter_round(x, 0, 4, 8, 12)
            quarter_round(x, 1, 5, 9, 13)
            quarter_round(x, 2, 6, 10, 14)
            quarter_round(x, 3, 7, 11, 15)
            quarter_round(x, 0, 5, 10, 15)
            quarter_round(x, 1, 6, 11, 12)
            quarter_round(x, 2, 7, 8, 13)
            quarter_round(x, 3, 4, 9, 14)
        for c in struct.pack('<16L', *(
                (x[i] + ctx[i]) & 0xffffffff for i in range(16))):
            yield c
        ctx[12] = (ctx[12] + 1) & 0xffffffff
        if ctx[12] == 0:
            ctx[13] = (ctx[13] + 1) & 0xffffffff

def chacha20_encrypt(data, key, iv=None, position=0):
    """Encrypt (or decrypt) with the ChaCha20 cipher."""
    if not isinstance(data, bytes):
        raise TypeError
    if iv is None:
        iv = b'\0' * 8
    if isinstance(key, bytes):
        if not key:
            raise ValueError('Key is empty.')
        if len(key) < 32:
            # TODO(pts): Do key derivation with PBKDF2 or something similar.
            key = (key * (32 // len(key) + 1))[:32]
        if len(key) > 32:
            raise ValueError('Key too long.')
    return bytes(a ^ b for a, b in
                 zip(data, yield_chacha20_xor_stream(key, iv, position)))

assert chacha20_encrypt(
    b'Hello World', b'chacha20!') == b'\xeb\xe78\xad\xd5\xab\x18R\xe2O~'
assert chacha20_encrypt(
    b'\xeb\xe78\xad\xd5\xab\x18R\xe2O~', b'chacha20!') == b'Hello World'

def run_tests():
    import binascii
    uh = lambda x: binascii.unhexlify(bytes(x, 'ascii'))
    for i, (ciphertext, key, iv) in enumerate((
            (uh('76b8e0ada0f13d90405d6ae55386bd28bdd219b8a08ded1aa836efcc8b770d'
                'c7da41597c5157488d7724e03fb8d84a376a43b8f41518a11cc387b669'),
             uh('00000000000000000000000000000000'
                '00000000000000000000000000000000'),
             uh('0000000000000000')),
    )):
        assert chacha20_encrypt(b'\0' * len(ciphertext), key, iv) == ciphertext
        print('Test %d OK.' % i)
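A note on why the same `chacha20_encrypt` call both encrypts and decrypts: ChaCha20 is a stream cipher that XORs the data with a keystream, and XOR is self-inverse. The following self-contained sketch illustrates that property with a stand-in keystream (not real ChaCha20 output):

```python
# XOR with the same keystream twice returns the original data; this is why
# one chacha20_encrypt() call serves for both encryption and decryption.
def xor_stream(data: bytes, keystream: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream))

keystream = bytes(range(11))                # stand-in keystream, NOT ChaCha20 output
ct = xor_stream(b'Hello World', keystream)  # "encrypt"
pt = xor_stream(ct, keystream)              # "decrypt" with the identical call
assert pt == b'Hello World'
```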
Set Geo Location
def share_geolocation():
    # Reconstructed from the extracted listing; names marked "assumed" were
    # lost in extraction and are filled in with plausible values.
    mycursor = mysql.connection.cursor()
    mycursor.execute('SELECT * FROM data_user WHERE username = %s', (uname,))
    rr = mycursor.fetchone()
    name = rr[1]
    owner = rr[2]
    ff = open("static/location.txt", "r")  # file name assumed
    loc = ff.read()
    ff.close()
    mycursor.execute("SELECT count(*) FROM data_files f, data_share s WHERE f.id = s.file_id AND s.username = %s", (uname,))
    c1 = mycursor.fetchone()[0]
    if c1 > 0:
        mycursor.execute("SELECT * FROM data_files f, data_share s WHERE f.id = s.file_id AND s.username = %s", (uname,))
        dat = mycursor.fetchall()
        for d1 in dat:
            status = ''
            if d1[13] == 1:
                status = '1'
            if d1[13] == 2:
                lat, lon = loc.split(',')  # "lat,lon" layout of the location file assumed
                lat1 = lat.split('.')
                lt1 = lat1[0]
                lt11 = lat1[1]
                lt2 = lt11[0:4]
                lon1 = lon.split('.')
                lo1 = lon1[0]
                lo2 = lon1[1]
                mycursor.execute("SELECT * FROM share_location WHERE share_id = %s", (d1[9],))
                d33 = mycursor.fetchall()
                for d3 in d33:
                    mycursor.execute("SELECT * FROM geo_location WHERE id = %s", (d3[4],))
                    d4 = mycursor.fetchone()
                    g1 = d4[2]
                    # Stored polygon text assumed in the form
                    # "new google.maps.LatLng(lat, lon), ..."; marker string assumed.
                    geo_location = g1.split('new google.maps.LatLng(')
                    g21 = ''.join(geo_location)
                    g22 = g21.split('), ')
                    g23 = '-'.join(g22)
                    g24 = g23.split('-')
                    gn = len(g24) - 1
                    points = []  # collected truncated coordinate parts (name assumed)
                    i = 0
                    while i < gn:
                        f1 = g24[i].split('.')
                        geo1 = f1[0]
                        f2 = f1[1]
                        f3 = f2[0:4]
                        points.append(f3)
                        i += 1  # increment missing in the original listing
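The loop above reduces each coordinate to its whole part plus the first four decimal digits before comparison. A self-contained sketch of that truncation idea (the helper names and the 4-digit precision are assumptions drawn from the listing, not the report's definitive implementation):

```python
def coord_prefix(value: str, digits: int = 4) -> str:
    # Keep the whole part and the first `digits` decimal digits of a coordinate.
    whole, frac = value.split('.')
    return whole + '.' + frac[:digits]

def roughly_same(coord_a: str, coord_b: str) -> bool:
    # Two coordinates are treated as matching when their truncated forms agree.
    return coord_prefix(coord_a) == coord_prefix(coord_b)

assert coord_prefix('77.594612') == '77.5946'
assert roughly_same('12.971599', '12.971512')      # agree to 4 decimals
assert not roughly_same('12.971599', '12.971699')  # differ at the 4th decimal
```

Truncating latitude at four decimal places corresponds to roughly 11 m of resolution, which is what makes this a coarse, cloaked location match rather than an exact one.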
Set Time and Date
def share_time():
    date_st = ''
    time_st = ''
    days_st = ''
    # between dates
    sdate = d1[15]
    edate = d1[16]
    sd1 = sdate.split('-')
    ed1 = edate.split('-')
    import datetime
    # Current date, split into day/month/year (assumed; missing in the listing).
    cd1 = datetime.datetime.now().strftime("%d-%m-%Y").split('-')
    sdd = datetime.datetime(int(sd1[2]), int(sd1[1]), int(sd1[0]))
    cdd = datetime.datetime(int(cd1[2]), int(cd1[1]), int(cd1[0]))
    edd = datetime.datetime(int(ed1[2]), int(ed1[1]), int(ed1[0]))
    if sdd <= cdd <= edd:
        date_st = '1'
    else:
        date_st = '0'  # the listing set '1' in both branches; corrected
    # allowed days of the week
    dys = d1[19]
    dy = dys.split(',')
    x = 0
    from datetime import datetime
    dty = datetime.now()
    ddy = dty.strftime('%A')
    ddr = ['Sunday', 'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday']
    i = 0
    for ddr1 in ddr:
        i += 1
        if ddr1 == ddy:
            break
    cdy = str(i)
    for dy1 in dy:
        if cdy == dy1:
            x += 1
    if x > 0:
        days_st = '1'
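The day-matching logic above builds a 1-based, Sunday-first index by scanning a weekday list. The same check can be sketched compactly with `strftime('%w')`, which yields 0 for Sunday through 6 for Saturday (the comma-separated format of the stored day list is an assumption based on the listing):

```python
from datetime import datetime

def day_allowed(allowed_days_csv: str, now: datetime) -> bool:
    # Stored day indices are assumed 1-based, Sunday=1 .. Saturday=7,
    # matching the ddr list in the listing above.
    current = int(now.strftime('%w')) + 1  # %w gives 0=Sunday..6=Saturday
    return str(current) in allowed_days_csv.split(',')

assert day_allowed('1,3,5', datetime(2025, 6, 1))       # 2025-06-01 is a Sunday (index 1)
assert not day_allowed('2,4', datetime(2025, 6, 1))
```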
def file_download():
    fid = request.args.get('fid')
    mycursor = mysql.connection.cursor()
    mycursor.execute("SELECT * FROM data_files WHERE id = %s", (fid,))
    value = mycursor.fetchone()
    path = "static/upload/" + value[3]
    return send_file(path, as_attachment=True)
APPENDIX II: SNAP SHOTS
Overview
User Interface (UI) design plays a critical role in the adoption and usability
of the “A Privacy-Preserving Framework for Cloud Data Access Using the Data
Concealment Model” system.
Home Page
The home page introduces A Privacy-Preserving Framework for Cloud Data Access
Using the Data Concealment Model and guides users to the correct flows based
on their roles. The home page can be viewed in Fig. 9.1. The role options
indicate that the system supports multiple user roles, each with different
privileges and views:
Data Owner: Uploads and manages cloud data.
Data User: Requests and accesses data.
Admin: Manages users, permissions, and monitors the system.
Fig. 9.1 Home Page
The Data Owner Login Page allows registered data owners to securely access
their accounts to upload, encrypt, and manage data in the cloud; it can be
viewed in Fig. 9.5.
Fig. 9.5 Data Owner Login Page
User Creation
File Upload Section
Fig. 9.9 Selecting the Cloaking Method
Selecting Location
Fig. 9.11 Time-Based Cloaking Chosen
Fig. 9.13 Viewing the Owner's Data
Concealed Data
BIBLIOGRAPHY
1. J. Gao, H. Yu, X. Zhu and X. Li, "Blockchain-based digital rights
management scheme via multiauthority ciphertext-policy attribute-based
encryption and proxy re-encryption", IEEE Syst. J., vol. 15, no. 4, pp.
5233-5244, Dec. 2021.
2. J. Sun, D. Chen, N. Zhang, G. Xu, M. Tang, X. Nie, et al., "A privacy-
aware and traceable fine-grained data delivery system in cloud-assisted
healthcare IIoT", IEEE Internet Things J., vol. 8, no. 12, pp. 10034-10046,
Jun. 2021.
3. P. Patil and M. Sangeetha, "Blockchain-based decentralized KYC
verification framework for banks", Proc. Comput. Sci., vol. 215, pp. 529-
536, Jan. 2022.
4. P. Sanchol, S. Fugkeaw and H. Sato, "A mobile cloud-based access control
with efficiently outsourced decryption", Proc. 10th IEEE Int. Conf. Mobile
Cloud Comput. Services Eng. (MobileCloud), pp. 1-8, Aug. 2022.
5. S. Fugkeaw, "A lightweight policy update scheme for outsourced personal
health records sharing", IEEE Access, vol. 9, pp. 54862-54871, 2021.
6. S. Qi, W. Wei, J. Wang, S. Sun, L. Rutkowski, T. Huang, et al., "Secure
data deduplication with dynamic access control for mobile cloud
storage", IEEE Trans. Mobile Comput., pp. 1-18, 2023.
7. S. Wang, H. Wang, J. Li, H. Wang, J. Chaudhry, M. Alazab, et al., "A fast
CP-ABE system for cyber-physical security and privacy in mobile
healthcare network", IEEE Trans. Ind. Appl., vol. 56, no. 4, pp. 4467-4477,
Jul. 2020.
8. X. Li, T. Liu, C. Chen, Q. Cheng, X. Zhang and N. Kumar, "A lightweight
and verifiable access control scheme with constant size ciphertext in edge-
computing-assisted IoT", IEEE Internet Things J., vol. 9, no. 19, pp.
19227-19237, Oct. 2022.
9. Y. Chen, J. Li, C. Liu, J. Han, Y. Zhang and P. Yi, "Efficient attribute based
server-aided verification signature", IEEE Trans. Services Comput., vol.
15, no. 6, pp. 3224-3232, Nov. 2022.
10. Y. Lin, J. Li, X. Jia and K. Ren, "Multiple-replica integrity auditing
schemes for cloud data storage", Concurrency Comput. Pract. Exper., vol.
33, no. 7, pp. 1, Apr. 2021.
WEB REFERENCES