INTERNAL VERIFIER CHECKLIST
Assessor: Ms Sumudu Samarakoon
Internal Verifier:
Unit(s): Unit 6 – Planning a Computing Project
Assignment title: Planning a Project on the Big Data Technologies in achieving operational efficiency
Student's name:
List which assessment criteria the Assessor has awarded: Pass / Merit / Distinction
Programme Leader signature (if required):            Date:
Higher Nationals – Summative Assignment Feedback Form
Student Name/ID
LO1. Conduct small-scale research, information gathering and data collection to generate knowledge on an
identified subject
LO3. Produce project plans based on research of the chosen theme for an identified organisation
LO4. Present your project recommendations and justifications of decisions made, based on research of the
identified theme and sector
Pass, Merit & Distinction Descriptors: P6, P7, P8, M4, D2
Resubmission Feedback:
Action Plan
Summative feedback
1. A Cover page or title page – You should always attach a title page to your assignment.
Use the previous page as your cover sheet and make sure all the details are accurately filled.
2. Attach this brief as the first section of your assignment.
3. All the assignments should be prepared using word processing software.
4. All the assignments should be printed on A4 sized paper. Use single-sided printing.
5. Allow 1” for the top, bottom and right margins and 1.25” for the left margin of each page.
1. The font size should be 12 point and the font style should be Times New Roman.
2. Use 1.5 line spacing. Left justify all paragraphs.
3. Ensure that all the headings are consistent in terms of the font size and font style.
4. Use the footer function in the word processor to insert your name, subject, assignment
number and page number on each page. This is useful if individual sheets become detached
for any reason.
5. Use the spell check and grammar check functions of the word processing application to help
edit your assignment.
Important Points:
1. It is strictly prohibited to use text boxes to add text to the assignment, except for
compulsory information, e.g. figures, tables of comparison, etc. Adding text boxes to the
body other than for the aforementioned compulsory information will result in rejection of
your work.
2. Carefully check the hand-in date and the instructions given in the assignment. Late
submissions will not be accepted.
3. Ensure that you give yourself enough time to complete the assignment by the due date.
4. Excuses of any nature will not be accepted for failure to hand in the work on time.
5. You must take responsibility for managing your own time effectively.
6. If you are unable to hand in your assignment on time and have valid reasons such as
illness, you may apply (in writing) for an extension.
7. Failure to achieve at least PASS criteria will result in a REFERRAL grade.
8. Non-submission of work without valid reasons will lead to an automatic REFERRAL. You
will then be asked to complete an alternative assignment.
9. If you use other people’s work or ideas in your assignment, reference them properly
using the Harvard referencing system to avoid plagiarism. You have to provide both in-
text citations and a reference list.
10. If you are proven to be guilty of plagiarism or any academic misconduct, your grade
could be reduced to a REFERRAL or, at worst, you could be expelled from the course.
Student Declaration
I hereby declare that I know what plagiarism entails, namely to use another’s work and to
present it as my own without attributing the sources in the correct way. I further understand
what it means to copy another’s work.
Assignment Brief
Student Name /ID Number
Unit Tutor
Issue Date
Submission Date
Submission Format:
The submission should be in the form of an individual report with the following sections.
You are required to make use of headings, paragraphs, and subsections as appropriate, and
all work must be supported with research and referenced using Harvard referencing system.
Please provide in-text citation and a list of references using Harvard referencing system.
Please note that this is an individual assessment, and your report should include evidence
that you have conducted research to collect relevant data individually.
Unit Learning Outcomes:
LO1 Conduct small-scale research, information gathering and data collection to generate
knowledge on an identified subject
LO2 Explore the features and business requirements of organisations in an identified sector.
LO3 Produce project plans based on research of the chosen theme for an identified organisation
LO4 Present your project recommendations and justifications of decisions made, based on research
of the identified theme and sector
Assignment Brief and Guidance:
Research Topic: The impact of the application of Big Data Technologies in operational efficiency
“Big data is a term that has become more and more common over the last decade. It was originally
defined as data that is generated in incredibly large volumes, such as internet search queries, data
from weather sensors or information posted on social media. Today big data has also come to
represent large amounts of information generated from multiple sources that cannot be processed
in a conventional way and that cannot be processed by humans without some form of
computational intervention. Big data can be stored in several ways: Structured, whereby the data is
organised into some form of relational format, unstructured, where data is held as raw,
unorganised data prior to turning into a structured form, or semi-structured where the data will
have some key definitions or structural form, but is still held in a format that does not conform to
standard data storage models. Many systems and organisations now generate massive quantities of
big data on a daily basis, with some of this data being made publicly available to other systems for
analysis and processing. The generation of such large amounts of data has necessitated the
development of machine learning systems that can sift through the data to rapidly identify
patterns, to answer questions or to solve problems. As these new systems continue to be
developed and refined, a new discipline of data science analytics has evolved to help design, build
and test these new machine learning and artificial intelligence systems. Utilising Big Data requires a
range of knowledge and skills across a broad spectrum of areas and consequently opens
opportunities to organizations that were not previously accessible. The ability to store and process
large quantities of data from multiple sources has meant that organisations and businesses are able
to get a larger overall picture of the pattern of global trends in the data to allow them to make
more accurate and up to date decisions. Such data can be used to identify potential business risks
earlier and to make sure that costs are minimized without compromising on innovation. However,
the rapid application and use of Big Data has raised several concerns. The storage of such large
amounts of data means that security concerns need to be addressed in case the data is
compromised or altered in such a way to make the interpretation erroneous. In addition, the
ethical issues of the storage of personal data from multiple sources have yet to be addressed, as
well as any sustainability concerns in the energy requirements of large data warehouses and lakes”.
(Pearson, 2023)
Assignment Scenario
You are expected to carry out a small-scale research project in order to explore the “impact of the
application of Big Data Technologies in operational efficiency in a range of academic, scientific
and economic areas” from the standpoint of a computing professional or a data scientist. The
research that you carry out can be based on an organization (or organizations), a field, a case study, a
scenario, etc. to which you have access, so that you can gather sufficient information to investigate the
applications, benefits and limitations of Big Data technologies.
The findings of the research should be presented in a professionally compiled report, and the report
should cover the given tasks, including:
A comprehensive project plan – including a work, time and resource allocation/breakdown
using appropriate tools, and a business area analysis covering the features and operational
areas of the business and the role of stakeholders and their impact on the success of the
business.
A research paper – including the application and evaluation of quantitative and qualitative
research methods to generate relevant primary data, and the examination of secondary sources
to collect relevant secondary data and information.
An action plan – including recommendations and an evaluation of project outcomes,
compared against the decisions given in the project plan.
TASK – 01 and 02 : Project Management Plan
Task 1
1.1. Select an organization (or organizations), a field, a case study or a scenario of your choice that
allows you to explore and study the relevant data on the application of Big Data technologies.
Plan a small-scale research project on the impact of the application of big data technologies on
operational efficiency.
Provide an introduction and background to your project and the chosen organization, field or
scenario. Define the scope and devise the aims/objectives of the project that you are going
to carry out. You should also include the risks and benefits of exploring the impact of big data
technologies on the chosen organization(s) or field.
TASK – 02
Discuss the features and operational areas of the chosen organization(s), and the role and impact of
stakeholders in the success of the business. You also need to analyse the challenges the
organization(s) may face in achieving success and meeting business objectives by applying big
data technologies to achieve operational efficiency.
Carry out the research to investigate the “impact of the application of big data technologies in
operational efficiency” and generate relevant primary data by applying appropriate qualitative and
quantitative research methods. You need to examine secondary sources to collect relevant
secondary data and information to support the research. You then need to analyse the data and
information and interpret the findings to generate knowledge on how the application of big data
technologies supports business requirements in the identified organization(s).
4.1 Communicate appropriate project recommendations derived from the analysed research
data to technical and non-technical audiences, and assess the extent to which
the project recommendations meet the needs of the chosen organization(s).
4.2 Discuss the reliability, accuracy and appropriateness of the research methods
applied, while arguing for and evaluating the planning recommendations made in the
project plan by comparing them to the actual outcomes and the needs of the chosen
organization(s).
ACKNOWLEDGEMENT
Table of Figures
Big data basically refers to large, intricate datasets that are mostly obtained from newly
developed data channels. These datasets are so large that they are difficult for traditional
data processing technologies to manage efficiently. Big data is defined as information that
is characterized by three key factors: its substantial volume, its velocity (the speed at
which it is created and gathered), and the variety of data points it covers (also known as
the "three Vs" of big data). Big data is produced by data mining activities and can take
many different forms. "Big data" is a self-explanatory term that refers to extremely large
datasets that are difficult to process using conventional computing methods. The phrase
encompasses more than simply the data itself; it also refers to the structures, instruments,
and methods necessary for managing it. For industry stakeholders, the continued
advancement of technology and the emergence of new channels of communication like
social networking pose challenges. They need to come up with creative solutions to deal
with the enormous volume and complexity of data produced in the current digital world.
(PRIYADHARSHANI, 2023)
Structured
Unstructured
Semi-structured
STRUCTURED
Highly organized information is kept in predetermined formats, such as relational
databases or spreadsheets, and is referred to as structured data. Every component adheres
to a particular data type and fits into predefined fields. Consistency and uniformity are
essential qualities that make querying and analysis simple. Computer science has
produced methods for handling structured data well, but as datasets become larger—up to
several zettabytes—problems start to appear.
UNSTRUCTURED
Unstructured data is held in its raw, unorganised form, with no predefined model or
schema, until it is turned into a structured form. Free text, images, audio and video, and
social media posts are typical examples. Because it does not fit neatly into rows and
columns, it requires specialised tools and greater processing effort before it can be
queried and analysed.
SEMI-STRUCTURED
Data that is partially structured combines elements of both structured and unstructured
data. Unlike fully structured data in conventional databases, it does not have a fixed
schema. Instead, it takes on a portion of the hierarchical structure found in XML and
JSON formats. Timestamped sensor data, log files, and metadata are a few examples. It
appears structured, but it is flexible enough not to follow table definitions rigidly. For
example, information in an XML file is arranged hierarchically, but it is not constrained
by the strict framework of fully structured databases. Semi-structured data is ideal in
scenarios where formats may change or differ, because of its versatility.
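To make the distinction concrete, the short Python sketch below parses a single, hypothetical semi-structured JSON record (the field names are invented for illustration) and separates the part that maps cleanly onto table columns from the part that has no fixed schema.

    import json

    # A hypothetical semi-structured record: some fields are predictable (id, timestamp),
    # while "payload" varies from record to record and has no fixed schema.
    raw = '''
    {
      "id": 101,
      "timestamp": "2023-11-05T10:15:00Z",
      "payload": {
        "type": "post",
        "tags": ["big data", "social media"],
        "reactions": {"like": 24, "share": 3}
      }
    }
    '''

    record = json.loads(raw)

    # The structured part maps cleanly onto columns of a relational table.
    row = {"id": record["id"], "timestamp": record["timestamp"]}

    # The flexible part is kept as-is; different records may carry different keys here.
    extras = record.get("payload", {})

    print(row)
    print(f"payload keys present in this record: {sorted(extras)}")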
FIGURE 1 BIG DATA (CEPERO, 2019)
A visionary entrepreneur set out to reimagine social connections at the dawn of the 21st
century, and he created a platform that has the potential to completely transform the
online interaction space. This was the beginning of the SocialSphere company's history,
when creative concepts were first planted, laying the groundwork for the company's
eventual development.
With an enthusiastic staff, the company set off on its innovative path, navigating the
constantly evolving digital terrain. Overcoming challenges, SocialSphere created a
feature-rich platform that includes multimedia sharing, real-time communication, and
immersive virtual experiences by utilizing cutting-edge technologies and user feedback.
As SocialSphere develops, its story will always be one of creativity, tenacity, and an
unrelenting dedication to providing a safe and engaging online environment for people all
over the world.
RISKS:
Privacy Concerns: Privacy is a major concern when big data technologies are integrated
into social media applications. Extensive user data gathering and analysis carry the risk of
security breaches or misuse, endangering user confidence and attracting regulatory notice.
Security Vulnerabilities: For SocialSphere, the expanding volume of data raises
security issues. Strong cybersecurity measures must be put in place to protect against data
breaches, unauthorized access, and other cyber threats endangering user information.
Ethical Issues: Big data's extensive use could lead to a number of ethical problems, such
as unintentional manipulation of user behavior, discriminatory practices, or biases in
algorithms. A harmonious balance between innovation and ethical considerations must be struck.
Data Accuracy and Quality: The effectiveness of big data depends on the veracity of the
underlying data. Inaccurate or poor-quality data collection might result in weakened
analytics and poorly thought-out business decisions.
BENEFITS:
SocialSphere Platform:
The main feature of SocialSphere is its robust social networking platform, which offers
users an easy-to-use and entertaining way to stay in touch with friends, post updates, and
engage with a range of material.
Profile Management:
Users of SocialSphere are able to create and manage customized profiles that dynamically
present their connections, hobbies, and activities in an interesting and eye-catching way.
News Feed:
A central feed within SocialSphere collects content from users' networks to provide a
constant stream of postings, updates, and multimedia content.
Friend Connections:
Within SocialSphere, users can connect with friends, family, and acquaintances to build a
network that forms the foundation for their social interactions.
Multimedia Sharing:
SocialSphere allows users to share a wide range of multimedia content, such as images,
videos, and status updates, effortlessly showcasing their creativity.
Messenger:
Through voice calls, video chats, and instant messaging, the SocialSphere Messenger
service enables users to communicate instantly, encouraging direct and in-the-moment
engagement.
Events and Groups:
SocialSphere facilitates the creation of and participation in events and groups by offering
tools for users to interact through shared interests, pastimes, and activities.
Marketplace:
With the help of a marketplace function built into SocialSphere, users may conduct
transactions and create a virtual marketplace inside the social media network.
Business Pages:
Businesses and organizations can create dedicated pages on SocialSphere, which opens
up possibilities for community building, consumer interaction, and brand marketing.
Advertising Solutions:
SocialSphere provides companies with highly focused advertising options that make use
of data analytics to maximize ad placement and guarantee successful interaction with the
target market.
At SocialSphere, our group of dedicated and varied individuals is our most valuable
asset. Together, they contribute to the life and success of our social media network. We
are committed to fostering an energetic and welcoming work environment, in line with
sector pioneers that place a premium on their human capital.
Our HR strategy seeks to attract, retain, and develop elite talent in the rapidly evolving
technology sector. In addition to technical proficiency, the hiring process gives weight to
a candidate's commitment to innovation and delivering a satisfying user experience. We
provide an environment where workers are inspired to think creatively, push boundaries,
and share their unique perspectives in order to shape the direction of social networking.
Users: The core of SocialSphere's success is its user base. User involvement, loyalty,
and contentment have a direct impact on the platform's growth and popularity. To
guarantee long-term success, it is essential to understand customer expectations and
deliver a satisfying experience.
Developers and Content Creators: The app developers and content creators that
make up the SocialSphere platform's creative community play a major role in its
appeal. The implementation of novel ideas, compelling functionalities, and diverse
material is crucial for drawing in new users and keeping existing ones interested,
which in turn affects the company's overall performance.
Data Privacy Regulations
One of the main challenges is navigating the complicated landscape of data privacy
legislation. Striking a careful balance between protecting user privacy and using big data
for insights is particularly demanding, especially in light of new laws like the GDPR.
Security Threats
Due to their massive user data, social media businesses are attractive targets for hackers.
Strong cybersecurity measures must be put in place to guard against data breaches and
unauthorized access.
Technological Advancements
It's difficult to keep users engaged and to maintain a user base in a cutthroat market. In
order to tailor services and content and guarantee user engagement, big data analytics
must be used to understand user behavior.
Manage Budgets and Timelines: Project managers are essential to businesses because
they negotiate budgets and schedules with teams, management, and important
stakeholders. Without a strong project manager, a project could run over budget and
experience delays.
Mitigate Project Risks: Any scenario that could have an impact on the project, whether
favorable or unfavorable, is considered a risk. A project manager uses a five-step process
to strategically create an organized plan for handling project risks.
About Project
Initiation:
Identify Stakeholders: Include internal teams, investors, and users in the process.
Planning:
Detail the project charter by outlining the project's scope, budget, schedule, and key
stakeholders, together with any other pertinent information about the project participants.
Risk Management:
Execution:
Team Building: Assemble people with a variety of backgrounds and specialties into
groups.
Status Meetings: Call frequent meetings to plan, address challenges, and celebrate
accomplishments.
Communication Management:
Quality Assurance
Continuous Improvement:
Create an engaging and user-friendly social media platform that prioritizes user
requirements and preferences in order to foster a welcoming and upbeat online
community.
Prioritize scalability when creating the platform to ensure that it can easily accommodate
a growing user base. Boost efficiency to offer a dependable and responsive user
experience even at peak usage times.
Utilize sophisticated algorithms and data analytics to tailor the way material is delivered
to users, giving them more engaging, interest-relevant feeds and tailored suggestions.
Cross-Platform Integration:
Enable seamless integration with various digital platforms and technologies to expand the
platform's accessibility by allowing users to interact with SocialSphere on a variety of
devices and interfaces.
Platform Development:
Mobile Application:
Developing a mobile application for the iOS and Android operating systems in
order to increase SocialSphere's reach to a wider range of users.
Putting in place mechanisms for user feedback and making incremental changes
guided by user observations.
Promoting an innovative atmosphere within the development team in order to stay
on the cutting edge of developments in the industry.
Regulatory Compliance
Make a comprehensive project charter that outlines the main goals, the project's
scope, the stakeholders, and the justification for using big data to analyze social
media firms.
Create and record protocols that describe the steps involved in gathering data,
including the sources, techniques, and ethical issues. This ensures that relevant
data for the research study will be collected in an orderly and methodical manner.
Make a document detailing the technology used for analytics, processing, and data
storage, paying particular attention to important factors like efficiency and
scalability. Emphasize the reasoning behind each technological decision, making
sure the tools chosen are compatible with the project's specifications and flexible
enough to adjust as needed. The goal of this document is to give a concise
overview of the technological underpinnings, guaranteeing top performance and
adaptability in data management over the course of the project.
Assumptions
User Adoption:
Users are expected to adopt the SocialSphere platform with ease. Even when an interface
is designed with aesthetics and usability in mind, market conditions and competition can
affect how well-received it is by users.
Technology Stability:
The initiative is predicated on the idea of a stable technology landscape. However, the
planned development and implementation timescale may be impacted by unforeseen
technology developments or disruptions.
Market Trends:
The project's objectives are assumed to be in line with anticipated market trends.
Unexpected developments in the social media landscape or abrupt changes in user
preferences could have an impact on the project's success.
Constraints:
Budgetary Constraints:
The project operates within pre-established financial constraints. The scope or scale of
particular project components may be limited by any unanticipated increase in costs or
unanticipated expenses.
Timeline Constraints:
The project adheres to a set timeline. Unexpected occurrences, such as technical problems or
outside influences, could affect how quickly project milestones are completed.
Technological Dependencies:
Limitations:
Data Accuracy:
While every effort will be made to ensure that the collected and analyzed data is accurate,
errors in user input or insufficient datasets may lead to problems with data accuracy.
Security Risks:
Although the project has put strong security measures in place, it is aware of the inherent
risks related to cybersecurity. Unforeseen security breaches or threats may reduce the
project's ability to protect user data effectively.
Risk identification
Increased data collection and processing could put users' security and privacy at
risk, through things like unintended exposure of personal information, data
breaches, and illegal access.
Budget overruns may result from unforeseen costs associated with technology
purchases, training, and infrastructure upgrades.
Risk Management Plan with Risk Prevention mechanism and Risk Monitoring
Project Sponsor:
Project Manager:
Cross-Functional Teams:
Responsibilities: Creating and carrying out marketing plans, publicizing the project, and
making sure that people are effectively communicated with.
Advisory Board:
Project Sponsor
With ABC at the forefront, and with their innovative leadership in the social media business,
the Project Sponsor plays a crucial role in the revolutionary SocialSphere initiative.
ABC has a strong background in the industry and brings a wealth of knowledge and strategic
insights to the project. In the capacity of Project Sponsor, ABC is committed to offering
the tools, direction, and assistance required to ensure the SocialSphere project is carried
out successfully. Maintaining a clear vision for SocialSphere's progress in the ever-
changing social media ecosystem and coordinating the project with the company's larger
goals are critical tasks for ABC's leadership. With ABC's extensive knowledge and
dedication, SocialSphere is well positioned for growth and innovation in the cutthroat
world of social media.
Within the dynamic realm of SocialSphere, where new ideas flourish, the Project
Manager is essential to managing the smooth execution of programs similar to
those found in large social media companies such as Facebook. The Project Manager is
responsible for ensuring the success of the project and has a wide range of tasks to carry out.
1. Initiation Phase
2. Planning Phase
3. Execution Phase
1. Initiation Phase
Call a kickoff meeting to present the project team, establish ground rules, and discuss the
main objectives of the endeavor.
Define the research project's scope clearly, outlining the specific elements of social media
and big data to be examined.
Conduct a thorough analysis of the existing literature regarding the use of big data by
social media businesses, with a focus on similar initiatives and noteworthy findings.
Determine and group stakeholders, including users of social media platforms, data
analysts, sponsors, and government agencies.
Clearly state the objectives of the research by outlining the dimensions of social media
and big data usage that will be examined.
Allocate resources for data acquisition, tools, and research personnel. Ensure access to
relevant databases and sources.
Collect data from various social media platforms, considering privacy and ethical
considerations.
Utilize appropriate analytical tools to analyze big data sets, identifying patterns, trends,
and correlations.
Conduct case studies on specific social media companies to gain in-depth insights into
their big data utilization strategies.
Conduct interviews with industry experts, data scientists, and stakeholders. Implement
surveys to gather user perspectives.
Identify and address potential risks related to data integrity, ethical considerations, or
unexpected challenges in the research process.
5. Closing Phase
5.1 Documentation
Compile and present research findings, highlighting key insights, trends, and implications
for social media companies utilizing big data.
5.2 Recommendations
Provide recommendations for social media companies based on the research findings,
focusing on potential improvements or innovations.
Prepare a comprehensive final report summarizing the entire research project, including
methodologies, results, and conclusions.
Successful project managers skillfully handle a wide range of difficulties, such as imprecise
goals, tight deadlines, inefficient resource allocation, and poor communication. They
frequently use instruments like the milestone chart to address these problems and look for
workable solutions. Within the ever-changing field of project management, the milestone
chart has become increasingly important. This section looks at the history, fundamentals,
and more sophisticated methods related to milestone charts; by understanding these aspects,
project managers can use milestone charts to improve project planning and execution and
to manage the complexities present in their projects. (KAGAN, 2023)
No matter the size of the project, a Gantt chart is an invaluable tool for project management,
since it makes planning and scheduling much easier. It is very helpful because it provides a
picture of the tasks and schedule of a project. To show the numerous tasks, their sequence,
durations, and start and completion dates, the chart uses horizontal bars of varying lengths.
Project managers can easily monitor progress using this graphical representation, where each
bar indicates the task's completion state. This transparency improves the efficiency of project
management. Because of their easy-to-understand format, Gantt charts are a vital tool for
project managers in helping to ensure the effective completion of projects. (GRANT, 2023)
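As an illustration only, the following Python sketch draws a minimal Gantt-style chart with matplotlib; the task names and dates are invented placeholders and would be replaced by the actual work breakdown for this project.

    from datetime import date
    import matplotlib.pyplot as plt

    # Hypothetical tasks: (name, start date, end date) — illustrative values only.
    tasks = [
        ("Initiation",  date(2024, 1, 1),  date(2024, 1, 14)),
        ("Planning",    date(2024, 1, 15), date(2024, 2, 11)),
        ("Execution",   date(2024, 2, 12), date(2024, 4, 21)),
        ("Closing",     date(2024, 4, 22), date(2024, 5, 5)),
    ]

    fig, ax = plt.subplots(figsize=(8, 3))
    for i, (name, start, end) in enumerate(tasks):
        # Each horizontal bar starts at the task's start date and spans its duration in days.
        ax.barh(i, (end - start).days, left=start, height=0.4)

    ax.set_yticks(range(len(tasks)))
    ax.set_yticklabels([t[0] for t in tasks])
    ax.invert_yaxis()               # first task at the top, as in a typical Gantt chart
    ax.set_xlabel("Date")
    ax.set_title("Project schedule (illustrative Gantt chart)")
    plt.tight_layout()
    plt.show()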
In the social media realm, creating a strong project control system is crucial due to the
extensive collection and analysis of user data. This method focuses on addressing privacy
and ethical concerns tied to big data use in social media, mirroring practices like those of
SocialSphere. The primary goal is to establish a framework that ensures responsible,
transparent, and accountable data management throughout the entire project lifecycle.
One of the most important components of the proposed project control system is the
establishment of clear instructions for the collection, processing, and use of data. Social
media companies can set precise boundaries on the kind of data they collect by creating
comprehensive policies that clearly outline their goals. This ensures that user data is treated
with the utmost care, following the guidelines of user autonomy and permission.
Quality Control
Quality control is the process of assessing and measuring goods and services in relation to
predetermined standards. Quality is a complex topic with many different interpretations. This
structured process enables companies to inspect, maintain, and enhance the caliber of their
products. Finding and fixing any deviations from the established quality benchmarks is the
main goal of quality control, which makes sure that the end product meets the expected levels
of excellence. Setting up exact controls is a crucial step in this process since it streamlines
production and provides a structure for handling quality issues. The possibility of mistakes is
reduced by clearly defining each employee's roles and duties, which strengthens the
organization's quality control system and makes it more reliable. (HAYES, 2023)
Inspection: Checking products, materials, or services on a regular basis to look for defects,
infractions, or departures from defined standards of quality.
Statistical Process Control (SPC): Monitoring and controlling manufacturing operations
using statistical techniques to make sure they adhere to acceptable quality standards (a
minimal control-limit sketch follows this list).
Documentation and Records: Preserving thorough records of all testing, inspections, and
remedial actions taken, in order to guarantee accountability and traceability.
Corrective Action: Putting appropriate steps in place to address identified quality problems
and prevent them from happening again.
Training and Education: Supplying staff with the knowledge and skills they need to
effectively maintain quality standards.
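As a minimal sketch of the SPC idea referenced in the list above, the Python snippet below computes 3-sigma control limits for a hypothetical series of quality measurements, using the sample standard deviation as a simple stand-in for the process sigma; the numbers are invented.

    import statistics

    # Hypothetical daily measurements of a quality metric (e.g., defects per 1,000 items).
    measurements = [4.1, 3.8, 4.5, 4.0, 3.9, 4.3, 4.7, 3.6, 4.2, 4.4]

    mean = statistics.mean(measurements)
    stdev = statistics.stdev(measurements)

    # Classic 3-sigma control limits: points outside them signal the process may be out of control.
    ucl = mean + 3 * stdev
    lcl = mean - 3 * stdev

    print(f"centre line: {mean:.2f}, UCL: {ucl:.2f}, LCL: {lcl:.2f}")
    for day, value in enumerate(measurements, start=1):
        flag = "OUT OF CONTROL" if value > ucl or value < lcl else "in control"
        print(f"day {day}: {value:.1f} -> {flag}")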
Quality control, which includes methods to assess and improve product quality, is an essential
part of corporate operations. By comparing a process's operations to those of rivals or industry
benchmarks, benchmarking helps identify areas in need of improvement. Every stage of the
production process, including product testing, is evaluated to make sure quality standards are met.
Quality Planning
To guarantee a successful trip, Quality Planning takes the lead in the project's early stages.
We begin by outlining specific project goals, with an emphasis on utilizing big data within
the context of social media. Diverse viewpoints are captured by identifying stakeholders,
who include developers, data scientists, end users, and decision makers. The next step is risk
assessment, which looks at data security, privacy, and compliance with regulations to
proactively resolve any possible problems. Resource management guarantees that the project
has the right people, equipment, and technologies to manage big data efficiently. Lastly,
when it comes to technology selection, we take great effort to pick scalable and effective
solutions for analytics, processing, and data storage. This all-encompassing strategy creates a
strong basis for the project's success.
Quality Assurance
To ensure trustworthy results, we place a high priority on quality assurance in each project.
The key component is data quality assurance, which involves taking steps to guarantee the
consistency, accuracy, and completeness of project data. In order to secure user privacy,
comply with data protection rules like GDPR, and avoid legal problems, compliance
assurance is equally crucial. The project's functionality is validated by the implementation of
rigorous testing, which encompasses unit, integration, and user acceptance testing.
Quality control
Our approach underscores quality control, implementing key tactics for effective
administration. Continuous monitoring tracks algorithm, data pipeline, and overall system
performance. We have an incident response protocol for swift resolution of data breaches and
security incidents. Key performance indicators (KPIs) within our metrics assess the
efficiency of our big data solution, ensuring a proactive and responsive approach to maintain
high standards in our projects.
Continuous Improvements
Peer Assessment
Regular code reviews and information sharing are two of the most important techniques we
prioritize in our development procedures. Code reviews preserve coding standards, find
issues early, and guarantee code excellence and consistency. Knowledge-sharing meetings
promote teamwork, problem-solving, and the sharing of expertise, all of which support
ongoing learning. Our development environment is strengthened and improved by these
techniques taken together.
Prior to project deployment, acceptance criteria define success benchmarks and specify
important rules. The standards they provide must be fulfilled in order for the project to be
successful. End users are actively involved in verifying if the technology lives up to their
expectations through user acceptance testing. This guarantees that the finished product meets
real-world needs and complies with technical requirements, providing a comprehensive
assessment prior to implementation.
Quality Checklists
Quality Preparation
Project success hinges on meticulous planning, emphasizing two crucial components. Firstly,
team members undergo structured training and skill development to enhance their
competencies and stay updated on emerging technologies. This ensures their capability in
overcoming project obstacles. Secondly, a commitment to Documentation Readiness stresses
the importance of maintaining accurate and comprehensive project documentation for future
reference. This dedication to transparency and a productive process significantly enhances
the overall quality of the project.
Social media now plays a crucial role in our everyday lives by promoting social interaction,
communication, and information exchange. With the introduction of SocialSphere, a
captivating story is told about how the combination of cutting-edge technologies and large
datasets can improve the social media environment. By examining the complex interplay
between big data processing, machine learning algorithms, and predictive analytics, this
study aims to reveal the fundamental architecture of SocialSphere.
All things considered, the SocialSphere inquiry provides insightful information on the
dynamics of social media in the future, showing us how big data technologies are shaping
and reshaping our online interactions, information consumption, and the digital landscape as
a whole. Maintaining awareness of the dynamic nature of social media and its possible
effects on individuals, companies, and society at large requires an appreciation of the
significance of this research.
Analyze the framework and methodology that SocialSphere uses to carry out significant,
large-scale data analysis. Understanding the variety of tools, technologies, and frameworks
used to handle and interpret large amounts of data from social media sites is necessary for
this.
Discover how SocialSphere collects and aggregates data from various social media channels.
Evaluate the tactics used to preserve data accuracy, relevance, and compliance with privacy
laws.
Examine the data processing techniques that SocialSphere employs to glean insightful
information from large datasets. This includes understanding different analytical techniques,
machine learning models, and algorithms.
Examine how SocialSphere enhances user involvement through the utilization of massive
datasets. Examine how they leverage data from users to personalize their experiences,
improve the way material is delivered, and increase user satisfaction in general.
Analyze SocialSphere's approach to privacy and morality when handling huge datasets.
Analyze the security measures put in place to protect user data, ensure compliance with data
protection laws, and maintain moral principles when using data.
Examine SocialSphere's viewpoint on impending big data and social media trends. Assess
their readiness to adopt new technology, adapt to changes in user behavior, and change with
the times in terms of industry standards.
Examine the challenges SocialSphere faces in utilizing big data and identify areas that could
use improvement or innovation. This investigation can include issues with scalability,
advancements in technology, or new trends in big data analytics.
Constraints
Strict laws pertaining to data privacy may limit researchers' access to and ability to examine
user data. The amount of data analysis that is feasible may be restricted in order to comply
with regulations such as the GDPR and with ethical considerations.
The limits of the available sample size may limit the scope of the research, especially when
trying to gather specific user comments or thoughts about SocialSphere versus Facebook.
Limitations
Like any other private company, SocialSphere is prohibited from disclosing specific
proprietary information about its vast data architecture, algorithms, or unique data processing
techniques. This restricted access could make it difficult to fully understand the technological
procedures that the organization uses.
Press releases, official channels, and media coverage are some of the ways that information
about SocialSphere can be shared. This data may be skewed, highlighting advantages and
downplaying disadvantages. Getting an impartial opinion could be difficult.
While examining customer satisfaction and feedback is important, verifying the accuracy and
sincerity of user reviews or testimonials on your own can be challenging. Bias in self-
reported experiences may have an impact on the accuracy of this data.
Social media algorithms are often complex and proprietary, especially when they use large
amounts of data. The research may not provide a thorough understanding of all the subtle
aspects of SocialSphere's algorithms, which would limit our understanding of the factors that
influence user interactions and content recommendations.
SocialSphere can operate in contexts with different legal and regulatory frameworks.
Following these rules may limit the investigation of specific facets of the business's big data
operations.
Literature Review
Concept
The goal of the research into SocialSphere, a big data-driven social media company, is to
examine the complex environment of the platform and make analogies with Facebook, the
market leader. This study's main goal is to comprehend how SocialSphere uses big data to
design its features, improve user experiences, and create overarching business plans. In order
to uncover the nuances of the platform's data-driven decision-making process, the research
will examine the platform's underlying infrastructure, data collection techniques, and
analytical approaches. The research will also explore privacy concerns, user engagement
tactics, and the effects of big data on SocialSphere's economic model, and will compare and
contrast these features with Facebook's. This study's major objective is to shed light on
SocialSphere's operational nuances while also offering insightful analysis of the larger
market of social media firms that use big data to innovate and improve user experience.
This study examines SocialSphere, a data-intensive social media company that innovatively
merges social media and advanced data analytics. It emphasizes the intentional integration of
extensive datasets from various platforms to extract valuable insights and improve user
experiences. Similar to Facebook, SocialSphere utilizes cutting-edge big data infrastructure,
machine learning models, and algorithms to stay at the forefront of the evolving industry.
The central theme underscores the pivotal role of data-driven decision-making in user
engagement, content personalization, and overall platform performance. The research delves
into how SocialSphere navigates the opportunities and challenges of big data in the evolving
landscape of social media, analyzing its impact on user satisfaction, company strategy, and
adherence to privacy and ethical norms. This study contributes to our understanding of how
big data will shape the future trajectory and competitive position of social media companies
in the digital ecosystem.
Benefits
Challenges
Research Approach
Qualitative research
One-on-One Interviews
Individual interviews remain one of the most fundamental and popular methods for
qualitative research. Open-ended and closed-ended questions are used in these interviews,
while open-ended discussions on particular subjects between researchers and participants are
frequently preferred. With the use of this communication method, participants' perspectives
can be explored more deeply, producing rich qualitative data that can be used for study. The
ability to gain detailed insights into people's thoughts and motivations is a noteworthy benefit
of individual interviews. Competent researchers who know how to ask the right questions can
steer these discussions in order to elicit insightful and complex findings.
Focus groups
One popular method for effective data collection in qualitative research is the focus group.
Focus groups consist of a small number of participants, usually between six and ten people
who fit the target demographic. Their main objective is to investigate the "what," "why," and
"how" aspects associated with a certain topic. One important benefit is that modern
approaches enable remote participation via online questionnaires that can be completed on
different devices. This strategy is made more appealing by the ease with which responses
may be gathered with only one click. But it's important to understand that focus groups can
be more expensive to use than other online qualitative research techniques. Their cost is
justified by their exceptional value in clarifying intricate procedures, which makes them the
go-to option for market research projects, especially when exploring new product launches
and testing creative ideas.
Ethnographic research
Case Study
A case study provides a thorough understanding of the topic under research through a
rigorous investigation and analysis of a particular incident. Its main objective is to place the
subject in the context of the actual world, regardless of whether it is focused on a particular
organization, belief system, occasion, person, or action. This research method is widely used
in a variety of sectors, including education and the social sciences. Despite its seeming
complexity, it involves a careful and rigorous investigation of data gathering methodologies
as well as a nuanced analysis of the material obtained. Case studies are very useful in
marketing and sales because they highlight the observable benefits of a business's goods or
services. They also explore the intricacies of a certain circumstance. This procedure adds
significant insights to the pertinent field of study and enables a sophisticated comprehension
of the underlying variables.
Observation
Quantitative research.
Survey Research
Interviews
Speaking with important social media firm stakeholders provides insightful information
about how they handle large amounts of data. Talks with executives, data scientists, and
legislators are included in this. It is useful to ask open-ended questions to learn about their
decision-making processes, reasons, and obstacles.
Focus Groups
Organise focus groups with users, privacy advocates, and industry professionals to find out
what people think about how big data is used by social media businesses. This method can
highlight both areas of concern and support and reveal a range of viewpoints.
Content Analysis
Insight into the company's official stance on big data use can be gained by looking closely at
its public statements, policies, and communications. This qualitative method helps to
understand how the business communicates its operating procedures to the general audience.
Ethnographic Studies
Surveys
Create surveys and distribute them to a broad user base to collect quantitative information
about their opinions on privacy, trust, and satisfaction with big data usage. To properly gauge
attitudes and opinions, use Likert-scale questions.
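A minimal sketch of how such Likert-scale responses might be summarised, assuming hypothetical answers coded 1 (strongly disagree) to 5 (strongly agree):

    from collections import Counter
    from statistics import mean

    # Hypothetical responses to one Likert item, coded 1-5 (invented values).
    responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 1, 4, 5, 3, 4]

    counts = Counter(responses)
    total = len(responses)

    print(f"mean score: {mean(responses):.2f}")
    for score in range(1, 6):
        share = 100 * counts.get(score, 0) / total
        print(f"score {score}: {counts.get(score, 0)} responses ({share:.0f}%)")

    # Share of respondents agreeing or strongly agreeing (scores 4 and 5).
    agree = 100 * sum(counts.get(s, 0) for s in (4, 5)) / total
    print(f"agreement (4-5): {agree:.0f}%")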
When permitted, obtain usage statistics and analytics that are available to the public from the
social media platform in order to track trends, patterns, and user behavior. This makes it
possible to quantify tangible measures and provide information about the scope and effect of
big data applications.
Statistical Analysis
The analysis of big data use by major social media companies like Facebook depends a great
deal on population and sample size. The population size, which includes all platform users, is
so large that studying every user directly is impractical, which makes careful sampling necessary.
It is crucial to consider the diversity of Facebook's user base when choosing an appropriate
sample size. This includes taking into account variables such as age, geography, hobbies, and
behavior. The sample size used must reflect this variability in order to produce relevant and
useful results that are applicable to a larger population.
It is important to recognize how sample size affects validity and reliability in research. A larger
sample size usually means more precise results, but it also means higher costs and more
logistical work. Achieving statistical significance while preserving practicality requires
striking a balance.
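One common way to reason about this trade-off is Cochran's formula for an initial sample size. The sketch below is illustrative only, assuming a 95% confidence level, a 5% margin of error and maximum variability (p = 0.5), with an optional correction for a smaller, known population.

    import math

    def cochran_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> float:
        """Initial sample size for estimating a proportion: n0 = z^2 * p * (1 - p) / e^2."""
        return (z ** 2) * p * (1 - p) / (e ** 2)

    def finite_population_correction(n0: float, population: int) -> int:
        """Adjust n0 when the population is finite: n = n0 / (1 + (n0 - 1) / N)."""
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    n0 = cochran_sample_size()
    print(f"initial sample size (large population): {math.ceil(n0)}")        # about 385
    print(f"adjusted for a population of 10,000 users: "
          f"{finite_population_correction(n0, 10_000)}")                     # about 370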
Dealing with the large and complex datasets created by social media users presents
opportunities as well as challenges in the field of big data. Advanced technology and
approaches are necessary to obtain meaningful insights. For manageable study and relevant
conclusions, it is essential to carefully pick a relevant sample from this large data pool.
Type of Data
Primary Data
Primary data is data that has been generated by the researchers themselves through surveys,
interviews, or experiments specially designed for understanding and solving the research problem
at hand. Primary data is also called raw data, as it is collected first-hand by the researchers.
Often, primary data sources are selected and customized especially to fulfill the
demands or specifications of a given research endeavor. Determining the target
population and the aim of the study is crucial before choosing a data collection source.
Secondary Data
It's possible that a researcher first gathered data for a particular goal before making it
available for use by other researchers. Data may have been collected for general
purposes without a specific study target in mind, similar to the national census.
Nature of Data: Primary data exists in the form of unprocessed, raw material, whereas
secondary data has already reached its completed state.
Surveys and structured questionnaires are useful tools used by researchers to gather
information from individuals or groups. These tools are made up of carefully constructed
questions meant to elicit particular data and insights. Data collection can be made flexible by
using a variety of channels for conducting surveys, including in-person interviews, phone
calls, postal forms, and online platforms. Conversely, observation involves collecting
information without asking questions. Subjectivity is introduced by this method, since the
observer or researcher interprets the data based on their own judgment.
Interviews
Observations
Researchers use observation as a study technique, paying close attention to and documenting
events, behaviors, and activities as they occur in their natural environments. This approach is
very helpful in getting real information about human behavior, interactions, or occurrences
without the influence of direct interference or direct questions. Unlike surveys or interviews,
observation involves silently observing persons in order to obtain information. While this
method adds subjectivity because the researcher's opinion is vital to the interpretation of the
data, there are circumstances in which bias is unlikely. Through observation, researchers can
get unique insights on unfiltered, real-world behaviors and occurrences by immersing
themselves in the natural environment.
Experiments
In experimental research, factors are intentionally changed in order to assess how they affect
the results. In these investigations, investigators exercise control over circumstances, altering
factors to track causal relationships. Changing variables on purpose to see how they affect
the final result is the basic idea behind experimental research. In order to determine the
causal relationship between altered variables and observed outcomes, scientists meticulously
monitor the experimental setup, gather data, and draw conclusions. This methodology
enables a systematic research of correlations, resulting in a deeper understanding of the
variables influencing the subject of study.
Interviews
Interviews are a data collection technique in which one person asks another member of the
target audience a series of questions. This one-on-one conversation can take place over the
phone or in person. The responses gathered are documented and evaluated to gain insights
into the interviewee's behavior, preferences, and experiences. Interviews have the advantage
of yielding high-quality data since respondents are less likely to supply false information
when speaking with someone face-to-face. Nevertheless, a disadvantage of this approach to
gathering data is that it is not feasible for extensive projects because of the time limits
involved in conducting one-on-one interviews. Interviews work especially well when
discussing complex or delicate topics.
Observation
Presenting the problem statement to the intended audience without any form of direct
moderation is how observation methods collect data. Respondents consider and answer
questions during this process, and their non-verbal indicators—body language, mannerisms,
facial expressions, and voice tone—are closely watched. These nonverbal cues offer
insightful information for making decisions.
Usage Data
With the growing prevalence of technology, information is collected at every stage of the
process, from production to distribution. The gathered information is a useful tool for
analysis and informed decision-making.
Focus Groups
Focus groups are made up of people from various backgrounds who participate in
conversations led by a moderator. Participants from various backgrounds contribute a variety
of perspectives, which promotes the investigation of many points of view. It is thought that
this diversity of perspectives encourages creativity. Stakeholders can obtain a variety of data
and quickly verify facts through focus groups. The existence of an extremely powerful
person in the group, however, could be detrimental and jeopardize the accuracy of the data. It
is essential that the moderator balances group dynamics and moderates talks with expertise.
Focus groups are very useful for running beta tests for newly released goods.
Modern businesses rely heavily on analytical tools because they offer critical insights from
large-scale databases. Because every firm has different needs, selecting the right tool can be a
difficult task. These tools, which include applications and software, let experts analyze data
collections. Their primary goal is to provide thorough insights and useful information for
predictions, informed decision-making, and deeper understanding. To effectively utilize
the potential of these tools, companies must understand the factors impacting tool selection
as they navigate the complex world of data analytics.
Tableau is a powerful business intelligence and data visualization solution that improves
reporting and analysis of large-scale datasets. Tableau is a software platform that was
founded in 2003 and later acquired by Salesforce in June 2019. It allows users to generate a
variety of charts, graphs, maps, dashboards, and stories, which helps users comprehend data
and make informed business decisions. Tableau lets customers explore datasets, get insights,
and find new possibilities through interactive dashboards and live visual analytics. Moreover,
the platform may be used to create interactive maps and carry out in-depth analysis across
multiple datasets and regions.
KNIME is unique in that it provides a one-stop shop for the whole data science process,
from model creation to deployment and insight sharing. The free and open-source KNIME
Analytics Platform integrates with well-known machine learning frameworks and has over
300 interfaces to diverse data sources, keeping users on the cutting edge of data science. Its
drag-and-drop interface lets users visualize the results and enter data from many sources,
making it simple to create complex machine learning algorithms. Furthermore, the
KNIME Business Hub offers a single location for teamwork and the implementation of
solutions, allowing data workers to effectively communicate, exchange knowledge, and
improve the abilities of their associates. Overall, KNIME accelerates the time to insight,
increasing the relevance of data science. (VENDOR, 2022)
Microsoft's spreadsheet application Excel is a vital part of the Office business application
suite. Users can effectively format, arrange, and compute data on a spreadsheet using Excel.
The grid structure of the software is made up of cells that hold data and are arranged in rows
and columns. Excel makes information easy to organize and display as it is added to or
amended. Excel is especially helpful for simple data analysis since it lets you filter, arrange,
and show numerical data. With a variety of filters and algorithms, it facilitates the
construction of pivot tables, charts, and graphs. However, because Excel is not designed to
handle enormous amounts of data, users may find themselves in need of more capable tools
when working with large datasets. (VENDOR, 2022)
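When a dataset outgrows a spreadsheet, the same pivot-table idea can be reproduced in a few lines of pandas; the column names and figures below are hypothetical and simply mirror the kind of usage data discussed in this report.

    import pandas as pd

    # Hypothetical usage records; in practice this would be read from a file, e.g. pd.read_csv(...).
    df = pd.DataFrame({
        "region":  ["EU", "EU", "US", "US", "Asia", "Asia"],
        "content": ["video", "post", "video", "post", "video", "post"],
        "views":   [1200, 800, 2500, 1400, 3100, 900],
    })

    # Equivalent of an Excel pivot table: total views by region and content type.
    pivot = df.pivot_table(values="views", index="region", columns="content", aggfunc="sum")
    print(pivot)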
Apache Spark
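Apache Spark is an open-source engine for distributed data processing that is commonly used when datasets become too large for a single machine. As a minimal, illustrative PySpark sketch (the file name and column names are hypothetical), an aggregation over user-interaction records might look like this:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Start (or reuse) a local Spark session; on a cluster the same code scales out.
    spark = SparkSession.builder.appName("InteractionStats").getOrCreate()

    # Hypothetical input: one row per user interaction, with a header row.
    interactions = spark.read.csv("interactions.csv", header=True, inferSchema=True)

    # Count interactions and average session length per content type.
    summary = (
        interactions
        .groupBy("content_type")
        .agg(
            F.count("*").alias("interactions"),
            F.avg("session_seconds").alias("avg_session_seconds"),
        )
        .orderBy(F.desc("interactions"))
    )

    summary.show()
    spark.stop()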
Ethical considerations are crucial in guiding discussions regarding the activities of big data-
driven social media companies such as SocialSphere. Because these platforms collect a lot of
personal data about users in order to improve user experiences and provide targeted
advertising, protecting user privacy is a big ethical concern. It becomes essential to find the
ideal balance between preventing intrusion and allowing personalization. Managing the
ethical aspects of data security, stopping illegal access, and making sure that data isn't
misused are essential. Keeping data collection and usage policies transparent is a crucial
ethical priority. Social media businesses need to make sure users have significant control
over their privacy settings and are educated about how their data is used. Preventing biases
and prejudice in algorithmic decision-making processes requires maintaining fairness. In the
end, these businesses have an obligation to foster a digital environment in the constantly
changing big data use landscape that upholds ethical standards, respects users' rights, and
encourages openness.
Roles of a Researcher
Researchers are essential to the research process because they collect, organize, and verify
important data related to a certain topic. Their responsibilities include data analysis, resource
comparison, fact checking, and sharing findings with the research team. Researchers respect
the privacy of sensitive data, adhere to established procedures, and conduct fieldwork as
necessary. To make sure that the goals of the research are aligned, they must remain current
on market trends. They make use of a variety of techniques, such as desktop research, and
they consult materials such as books, journal articles, newspapers, surveys, questionnaires,
and interviews. Written notes and pertinent software are used to record patterns and trends,
Problem Formulator
When it comes to identifying and defining research issues, researchers are essential. They
must carefully assess the gaps in current knowledge, provide relevant recommendations, and
clearly state the goals that direct the course of the research.
Data Collector and Analyst
The task assigned to researchers is to collect and analyze data in order to investigate their
research questions. This includes choosing appropriate research designs, gathering
information via surveys, experiments, or direct observation, and using statistical or
qualitative methods to draw important conclusions.
Knowledge Integrator
Academics often need to combine data from several sources in order to develop a
comprehensive understanding of their research topic. This means reading relevant literature,
digesting previous findings, and placing their own research in the context of a larger body of
knowledge.
Innovator
Researchers also act as innovators by proposing new methods, tools, or perspectives that
extend existing knowledge and open up fresh avenues of inquiry.
Communicator
For researchers to share their findings with the academic community and the general public,
effective communication is essential. This duty entails presenting research findings in
scholarly publications, conferences, and other venues in addition to breaking complex ideas
down into language that a range of audiences can grasp.
CHAPTER 4
2. What methods are employed to gather user data, and which sources contribute to the
company's data pool?
3. Which famous platforms and technologies does the business utilize to process and
analyze large amounts of data?
5. In what ways does the organization apply big data to improve customer experience
and engagement?
6. What measures are in place to guard against user data being misused?
7. Which upcoming developments in big data and analytics is the company closely
observing or intending to utilize?
Feedback Form
One of the most important parts of any research project is the analysis of respondent
demographics, which entails gathering and analyzing data about the traits of research
participants. This comprehensive analysis sheds light on how various demographic factors,
including gender, age, education, employment status, geographic location, technology usage,
ethnicity/race, income level, and social media habits, may affect perceptions, behaviors, and
responses. It also helps researchers gain insights into the diverse backgrounds and contexts of
respondents. Researchers can detect patterns, trends, and correlations in their data by closely
examining these demographic characteristics, which helps them interpret their findings with
greater nuance. In addition to adding value to the study, a well-executed examination of
respondent demographics strengthens the credibility and interpretability of the findings.
According to the feedback, 50% of respondents fall into the 'Other' category, while Directors
and administrative staff each account for an equal 25% of respondents.
Equal numbers of respondents have been with the company for about one year and for five or
more years, while 25% of respondents have been associated with the company for roughly one
to four years. This suggests that most respondents have substantial experience of the company.
Considering the overall responses, most respondents believe big data is used in the company
primarily for decision-making support. No one thinks it is used to improve operational
efficiency, while equal numbers believe it is used for user profiling and targeting and for
enhancing the user experience.
Most respondents think misuse of data is best guarded against through robust encryption and
secure storage practices, while 37.5% think regular audits and compliance checks guard
against user data being misused. No respondents selected strict access controls and
permissions or ongoing user education on data privacy.
Most respondents identified cloud-based analytics as the upcoming development in big data
and analytics that the company intends to utilize. No one selected machine learning and
predictive analytics, while 12.5% of respondents think the company intends to adopt
real-time data processing.
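Percentage breakdowns like the ones reported above can be reproduced from the raw responses with a short Python sketch; the answer values below are hypothetical placeholders rather than the actual survey data.

import pandas as pd

# Hypothetical raw answers to one survey question (placeholder values only)
responses = pd.Series([
    "Cloud-based analytics", "Cloud-based analytics", "Cloud-based analytics",
    "Cloud-based analytics", "Cloud-based analytics", "Cloud-based analytics",
    "Cloud-based analytics", "Real-time data processing",
])

# Percentage of respondents selecting each option
breakdown = responses.value_counts(normalize=True) * 100
print(breakdown.round(1))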
Task 4
Action Plan
1 . Issue: Social media platforms gather extensive user data, prompting worries regarding the
invasion of privacy.
Resolution: To protect user information, businesses should give top priority to clear
data usage policies, get express agreement from users, and put strong security
measures in place.
2 . Issue: The misuse of big data can manifest in targeted advertising, manipulation, and the
influencing of user behavior.
3 . Issue: Social media algorithms have the potential to be biased and produce discriminating
results.
Technical Audience
1 . Issue: Social media firms frequently encounter criticism due to mishandling user data,
resulting in heightened privacy apprehensions and security breaches.
2 . Issue: Social media companies have the difficulty of growing their infrastructure to
accommodate enormous volumes of data as user populations increase rapidly.
Resolution: use cloud services, deploy effective data partitioning techniques, and use
scalable and distributed computing frameworks.
3 . Issue: Handling continuous streams of user activity in real time places heavy demands on
batch-oriented data pipelines.
Resolution: Utilize stream processing frameworks such as Apache Kafka and Apache
Flink to effectively manage real-time data processing (a minimal producer sketch
follows this list).
4 . Issue: Big data usage in social media may unintentionally result in biased algorithms and
unethical AI practices.
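To make the stream-processing resolution above more concrete, here is a minimal sketch of publishing engagement events to an Apache Kafka topic using the kafka-python client; the broker address, topic name, and event fields are assumptions for illustration, not part of the project's actual infrastructure.

import json
from kafka import KafkaProducer

# Connect to a Kafka broker (the address and topic name below are assumptions)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

# Publish a hypothetical user-engagement event for downstream real-time processing
event = {"user_id": 42, "action": "like", "post_id": 1001}
producer.send("engagement-events", value=event)
producer.flush()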
Managing actual spending in relation to the budget on a regular basis is a critical component
of good project management. If discrepancies appear, it's critical to carry out a
comprehensive investigation to pinpoint the underlying causes and modify the budget as
needed. To evaluate and approve any changes that might have an influence on the budget and
make sure that pertinent stakeholders are notified before modifications are made, a strong
change control procedure must be put in place. To guarantee that project costs remain within
the authorized expenditure cap, strict cost controls and monitoring measures must also be put
in place. This includes establishing spending caps, obtaining estimates for major expenses, and
keeping a close eye on project expenses. Through the course of the project, project managers
can actively manage financial resources, reduce potential risks, and encourage greater
financial responsibility by implementing these best practices.
Ensure that any adjustments to the budget or other changes to the original Project
Management Plan are accurately recorded in the revised records. Maintaining accurate
records of budget modifications is essential for maintaining openness and giving a clear
picture of the project's financial situation. Furthermore, a thorough analysis of the critical
path can help identify and prioritize the tasks that determine the project's completion date.
Reducing overall project setbacks requires prompt resolution of any delays in critical
operations. Consider putting resource leveling solutions into practice if project timeline
limitations are present in order to guarantee a balanced workload distribution and maximize
resource allocation for crucial tasks. By permitting timely interventions to maintain project
schedules and budgetary objectives and providing a more accurate assessment of the project's
progress, these proactive approaches contribute to good project management.
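To make the critical-path analysis mentioned above more concrete, here is a small Python sketch using the networkx library on a hypothetical task network; the task names and durations are invented for illustration and do not come from the actual project plan.

import networkx as nx

# Hypothetical project milestones as a directed acyclic graph; each edge is an
# activity and its weight is that activity's duration in days.
tasks = nx.DiGraph()
tasks.add_weighted_edges_from([
    ("Start", "Research complete", 10),        # background research
    ("Research complete", "Plan approved", 5), # project planning
    ("Plan approved", "Data collected", 12),   # surveys and interviews
    ("Plan approved", "Tools evaluated", 4),   # tool evaluation
    ("Data collected", "Analysis done", 8),    # data analysis
    ("Tools evaluated", "Analysis done", 3),   # integrate tool findings
    ("Analysis done", "Report delivered", 6),  # report writing
])

# The critical path is the longest-duration path through the network;
# any delay on these activities delays the whole project.
critical_path = nx.dag_longest_path(tasks, weight="weight")
duration = nx.dag_longest_path_length(tasks, weight="weight")
print("Critical path:", " -> ".join(critical_path))
print("Minimum project duration (days):", duration)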
Recommendation
Technical Audience
Prioritize putting end-to-end encryption into place, making sure data is transmitted
securely, and carrying out frequent security audits to safeguard user data (a minimal
encryption sketch follows these recommendations).
Ensure open lines of communication and give users authority over their data. Be
motivated by SocialSphere's commitment to user privacy protection.
Strive for a seamless experience for users on all platforms, taking cues from
SocialSphere's excellent combination of Instagram and WhatsApp.
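As a minimal sketch of the encryption recommendation above, the following Python example encrypts a stored user record with symmetric encryption via the cryptography library. It illustrates protecting data at rest under an assumed key-management setup and is not a full end-to-end encryption scheme.

from cryptography.fernet import Fernet

# In practice the key would come from a secure key-management service;
# generating it inline here is an assumption made for this sketch.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a hypothetical user record before storing it
record = b'{"user_id": 42, "email": "user@example.com"}'
token = cipher.encrypt(record)

# Decrypt only when an authorised service needs to read it back
original = cipher.decrypt(token)
assert original == record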
Benefits to Company
Benefits to Industry
The industry as a whole has benefited greatly from the widespread use of big data analytics
in the social media space, as demonstrated by sites like SocialSphere. First off, businesses
may obtain deep insights into user behaviors, preferences, and trends through the strategic
application of big data, which promotes a customized and interesting user experience. In
addition to raising customer satisfaction, this enhanced personalization helps to strengthen
platform loyalty and user retention. Big data also enables social media companies to improve
their advertising methods, precisely targeting particular demographics and optimizing the
success of marketing campaigns. Enhanced data-driven decision-making, fueled by advanced
analytics, makes it possible to respond quickly to shifting market demands. As a result, there
is a boom in innovation within the sector, with businesses consistently launching new
features and enhancing existing ones. Ultimately, the incorporation of big data analytics into
the social media space serves as a driving force for advancement in the sector, resulting in
more resilient platforms, more user contentment, and long-term commercial expansion.
Interviews
Accuracy
Accurate interviews are essential for gathering trustworthy data and arriving at wise
judgments, especially in industries like hiring, research, and media. For the creation
of high-performing teams during employment interviews, proper assessment of
talent and cultural fit is crucial. In both study and journalism, accurate questioning
and focused listening during interviews are critical to the validity of material.
Consistency is encouraged by the use of structured interview techniques, and a
promising path toward objective evaluation is provided by the possible integration
of AI and data analytics. To put it simply, the reliability of interviews is a critical
component in attaining favorable results in a variety of professional and research
settings.
Reliability
The reliability and consistency of the data collected during interviews determine
how effective they are as a research approach. Interviews can yield trustworthy data
when conducted using a systematic and uniform methodology, particularly in
qualitative research. Reliability is increased by keeping the interview method,
interviewee selection, and question formulation consistent. To further add to the
reliability of the information obtained, the interviewer's skill in establishing rapport,
asking open-ended questions, and responding to lively discussions is crucial.
However, it is important to understand that variables like respondent variability,
interviewer bias, and interview setting might affect interview reliability. To limit
these effects, researchers should keep the interview protocol as consistent as possible
and document any deviations.
Questionnaires
Accuracy
Maintaining the validity and reliability of the data collected requires making sure
that questionnaire accuracy is up to par. In order to collect data for a variety of
purposes, such as market analysis, research, and academics, questionnaires are
essential. A well-designed questionnaire should avoid bias, utilize simple language,
and be in line with the study objectives in order to elicit accurate replies. The
accuracy of data collection is increased by closely monitoring the phrasing,
structure, and formatting of the questions. Before the questionnaire is widely distributed,
pilot testing also aids in identifying and resolving any potential ambiguities or
misunderstandings. Researchers and organizations can securely gain useful insights
from collected data by prioritizing accuracy in both questionnaire design and
implementation. This will strengthen the overall integrity of their results and
decision-making processes.
Reliability
Reliability of the questionnaire is essential for reliable and accurate data gathering.
The key is reliability, which reflects consistency in responses across time and
among respondents. Reliability is established using methods such as test-retest and
inter-rater reliability and metrics such as Cronbach's alpha. Good reliability
indicates consistent measurement of target parameters, which increases confidence
in research conclusions. To preserve reliability and provide confidence in study
outcomes, researchers must take biases into account when designing questionnaires.
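Since Cronbach's alpha is named above as a reliability metric, here is a small Python sketch that computes it for a hypothetical matrix of questionnaire item scores; the scores are invented for illustration and are not this project's survey data.

import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items score matrix.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance of total scores)
    """
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical Likert-scale answers: 6 respondents x 4 questionnaire items
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(round(cronbach_alpha(scores), 3))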
Questionnaires
To gather trustworthy and significant data, questionnaires must be valid, dependable, and
accurate. Carefully crafting questions that are properly aligned with study objectives is
crucial to enhancing validity as it helps to prevent ambiguity and potential misinterpretations.
A pilot study with a limited sample size facilitates the identification of any clarity or
relevance-related problems. Ensuring uniform interpretation and response from participants
is achieved by keeping consistency in language and formatting across all questions, hence
strengthening reliability. Reliability can also be increased by using verified survey questions
and defined measurement scales. Using clear language and avoiding biased or leading
questions improves accuracy. Questionnaires can be kept current and useful over time by
periodically reviewing and updating them as research needs evolve.
Future Suggestions
To reduce latency and enhance real-time interactions, handle data close to its source
by utilizing edge computing technology. This ensures a faster and more flexible user
experience, which is a key factor in SocialSphere’s success.
Limitations
Social media companies routinely face criticism as a result of privacy issues brought
on by their massive data collection and user tracking policies.
Biases from the training data may be incorporated into the big data analysis
algorithms, which could result in the production of discriminating outcomes.
Due to the vast amounts of user data they possess, social media companies are
attractive targets for cyberattacks and possible data breaches.
Prioritizing metrics like click-through rates and platform usage duration may not
accurately capture user satisfaction or the overall quality of user interactions.
Cybercriminals can carry out social engineering attacks, modify data, and spread
misleading narratives by taking advantage of flaws in big data platforms.
Annexures/ Appendixes
Project Charter
Project Scope The project focuses on evaluating big data usage in social media,
especially tactics similar to SocialSphere’s. It aims to identify best
practices in user engagement, security, and data analytics. The brief
study covers topics such as user behavior analysis, cross-platform
integration, real-time processing, advanced analytics, data security,
A/B testing, and ethical data practices. Drawing inspiration from
SocialSphere, the project seeks to extract valuable lessons to shape
market trends and establish best practices for social media
organizations using big data.
Deliverables
A thorough research report will be the outcome of the study project
on the impact of big data technologies on SocialSphere's business
operational efficiency. As one of the main deliverables, this report
will present the study's findings, analysis, and recommendations.
Assumptions /
Dependencies
The success of the study project is predicated on the assumption that
Big Data technologies can be seamlessly integrated into
SocialSphere's operational procedures. Important elements include
the project's reliance on precise and thorough data about the
business's operations as well as the effective adoption of Big Data
technology by pertinent SocialSphere stakeholders.
Manager – H.M.I.S.B.Hennayake
D.H.D. Rathnayke
H.KP.Hettiarchi
2. What methods are employed to gather user data, and which sources contribute to the
company's data pool?
4. How does the business use the data it has acquired to generate user profiles?
5. In what ways does the organization apply big data to improve customer experience
and engagement?
6. What measures are in place to guard against user data being misused?
7. Which upcoming developments in big data and analytics is the company closely
observing or intending to utilize?
LO1 Conduct small-scale research, information gathering and data collection to generate
knowledge on an identified subject
P2 Examine secondary sources to collect relevant secondary data and information for an
identified theme.
M1 Analyse data and information from primary and secondary sources to generate knowledge
on an identified theme.
D1 Interpret findings to generate knowledge on how the research theme supports business
requirements in the identified sector.
LO2 Explore the features and business requirements of organisations in an identified sector
LO3 Produce project plans based on research of the chosen theme for an identified
organisation
P5 Devise comprehensive project plans for a chosen scenario, including a work and resource
allocation breakdown using appropriate tools.
M3 Produce comprehensive project plans that effectively consider aims, objectives and
risks/benefits for an identified organization.
LO4 Present your project recommendations and justifications of decisions made, based on
research of the identified theme and sector
P7 Present arguments for the planning decisions made when developing the project plans.
D2 Evaluate the project planning recommendations made in relation to the needs of the
identified organisation and the accuracy and reliability of the research carried out.