
Contents

LO1: Examine appropriate research methodologies and approaches as part of the research process
Introduction
Produce a research proposal that clearly defines a research question or hypothesis supported by a literature review
Research proposal
Abstract
Research introduction
Reason for choosing this research project
Literature Review of AI in Natural Language Processing
Activities and Timescales of the research project
Research approach and Methodology
Research approaches implemented in AI digital marketing
 Primary Research
 Secondary Research
1) Qualitative Research Method
2) Quantitative Research Method
Examine appropriate research methods and approaches to primary and secondary research
Research method types
1. Primary Research
Types of Primary Research Methods
Observation
Quantitative
Focus groups
Interview
Surveys
Primary Research Advantages and Disadvantages
Secondary research method
Types of Secondary Research Methods
 Public sources
 Commercial sources
 Educational institutions
Advantages and disadvantages of secondary research
Advantages
Disadvantages
Pros and Cons of Primary vs Secondary Research
Difference between primary and secondary research
Conclusion
LO2: Conduct and analyze research relevant for a computing research project
Introduction
Conduct primary and secondary research using appropriate methods for a computing research project that considers costs, access and ethical issues
Conducting Secondary Research
Conducting Primary Research
Steps for choosing sources using the most appropriate method
Step 1: Identify and develop your topic
Step 2: Do a preliminary search for information
Step 3: Locate materials
Step 4: Evaluate your sources
Step 5: Make notes
Step 6: Write your paper
Step 7: Cite your sources properly
Step 8: Proofread
Cost of conducting research
Ways we can reduce research costs
Major ethical issues in conducting research
Ranking
Personalization
Fraud Prevention
Accessibility
Issues in conducting research
Apply appropriate analytical tools to analyze research findings and data
Data analysis
Why analyze data in research?
Types of data in research
 Qualitative data
 Quantitative data
Data analysis in qualitative research
Finding patterns in the qualitative data
Methods used for data analysis in qualitative research
Data analysis in quantitative research
Phase I: Data Validation
Phase II: Data Editing
Phase III: Data Coding
Methods used for data analysis in quantitative research
Considerations in research data analysis
Conclusion
LO3: Communicate the outcomes of a research project to identified stakeholders
Introduction
Natural language processing
Why Should Businesses Use Natural Language Processing?
How Can the Audience Use Natural Language Processing?
1) Improve user experience
2) Automate support
3) Monitor and analyze feedback
Communicate research outcomes in an appropriate manner for the intended audience
Email filters
Smart assistants
Search results
Predictive text
Language translation
Digital phone calls
Data analysis
Text analytics
Conclusion
LO4: Reflect on the application of research methodologies and concepts
Introduction
Reflect on the effectiveness of research methods applied for meeting the objectives of the business research project
The relationship between AI and natural language processing
Effectiveness of research methods applied for meeting the objectives of the business research project
1. Neural machine translation
2. Chatbots
3. Hiring tools
4. Conversational search
Advantages of Secondary Research
Consider alternative research methodologies and lessons learnt in view of the outcomes
Alternative research methodologies and lessons learnt in view of the outcomes
1. DISTRIBUTIONAL methodology
2. FRAME-BASED methodology
3. MODEL-THEORETICAL methodology
4. INTERACTIVE LEARNING
Conclusion
References
LO1: Examine appropriate research methodologies and approaches as part of the research
process
Introduction: Before starting the assignment I chose a research topic: AI in natural language processing. First I am going to write a research proposal covering the required points. After that I am going to examine the research approaches, including primary and secondary research.
Produce a research proposal that clearly defines a research question or hypothesis
supported by a literature review
Research proposal
Student name: Prabin Bhusal

Centre name: Chitwan, Bharatpur

Tutor: Savita Havalkod

Unit: Computing Research Project

Research Topic: AI in natural language processing

Abstract: The field of artificial intelligence, and its application in day-to-day life, has seen
remarkable evolution in the past three to five years. Artificial intelligence (AI) is an enabler that
potentially allows machines to do everything that humans can do. Natural language processing
(NLP) is a branch of computer science and artificial intelligence concerned with the interaction
between computers and human languages. It is the study of mathematical and computational
modeling of various aspects of language and the development of a wide range of systems,
including spoken-language systems that integrate speech and natural language. NLP has a role in
computer science because many aspects of the field deal with linguistic features of computation.
It is an area of research and application that explores how computers can be used to understand
and manipulate natural language text or speech to do useful things. Applications of NLP include
machine translation, natural language text processing and summarization, user interfaces,
multilingual and cross-language information retrieval (CLIR), speech recognition, and expert
systems.

Research introduction: NLP is a way for computers to analyze, understand, and derive meaning
from human language in a smart and useful way. By utilizing NLP, developers can organize and
structure knowledge to perform tasks such as automatic summarization, translation, named entity
recognition, relationship extraction, sentiment analysis, speech recognition, and topic
segmentation. “Apart from common word processor operations that treat text like a mere
sequence of symbols, NLP considers the hierarchical structure of language: several words make
a phrase, several phrases make a sentence and, ultimately, sentences convey ideas,” John
Rehling, an NLP expert at Meltwater Group, said in How Natural Language Processing Helps
Uncover Social Media Sentiment. “By analyzing language for its meaning, NLP systems have
long filled useful roles, such as correcting grammar, converting speech to text and automatically
translating between languages.” NLP is used to analyze text, allowing machines to understand
how humans speak. This human-computer interaction enables real-world applications
like automatic text summarization, sentiment analysis, topic extraction, named entity
recognition, parts-of-speech tagging, relationship extraction, stemming, and more. NLP is
commonly used for text mining, machine translation, and automated question answering. NLP is
characterized as a difficult problem in computer science. Human language is rarely precise or
plainly spoken. To understand human language is to understand not only the words, but the
concepts and how they are linked together to create meaning. Despite language being one of the
easiest things for the human mind to learn, the ambiguity of language is what makes natural
language processing a difficult problem for computers to master.
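The ambiguity problem described above can be made concrete with a small sketch. The following Python snippet is a toy word-sense disambiguator: the sense inventory and the clue words are invented for illustration only, but the principle (picking a sense by counting overlapping context words) mirrors how simple knowledge-based disambiguation works.

```python
# Toy word-sense disambiguation: pick the sense of an ambiguous word by
# counting how many of its "clue" words appear in the sentence.
# The senses and clue words below are illustrative assumptions only.

SENSES = {
    "bank": {
        "financial institution": {"money", "loan", "deposit", "account"},
        "river edge": {"river", "water", "fishing", "shore"},
    }
}

def disambiguate(word, sentence):
    """Return the sense whose clue words overlap most with the sentence."""
    context = set(sentence.lower().split())
    best_sense, best_overlap = None, -1
    for sense, clues in SENSES[word].items():
        overlap = len(clues & context)
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(disambiguate("bank", "She opened a savings account at the bank"))
# -> financial institution
```

A real system would use a full lexicon and richer context modeling, but even this sketch shows why context beyond the single word is indispensable.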

Aims and Objectives

The aim of natural language processing is to specify a theory of language comprehension and
production in such detail that a person is able to write a computer program which can
understand and produce natural language. The basic goal of NLP is to accomplish human-like
language processing. The choice of the word “processing” is very deliberate and should not be
replaced with “understanding”: although the field of NLP was originally referred to as
Natural Language Understanding (NLU), that goal has not yet been accomplished. A full NLU
system would be able to:

 Paraphrase an input text.
 Translate the text into another language.
 Answer questions about the contents of the text.
 Draw inferences from the text.

While NLP has made serious inroads into accomplishing the first three goals, NLP systems
cannot, of themselves, draw inferences from text, so NLU still remains the goal of NLP. There
are also practical applications of NLP. An NLP-based IR system has the goal of providing more
precise, complete information in response to a user’s real information need. The goal of the NLP
system is to represent the true meaning and intent of the user’s query, which can be expressed as
naturally as in everyday language.

NLP draws on a number of disciplines such as computer and information sciences, linguistics,
mathematics, electrical and electronic engineering, artificial intelligence and robotics, and
psychology. Applications of NLP include machine translation, natural language text processing,
summarization, user interfaces, multilingual and cross-language information retrieval (CLIR),
speech recognition, and expert systems. Research on NLP is regularly published in a number of
conferences such as the annual proceedings of the ACL (Association for Computational
Linguistics) and its European counterpart EACL, the biennial proceedings of the Message
Understanding Conferences (MUC), the Text Retrieval Conferences (TREC) and the ACM-SIGIR
(Association for Computing Machinery, Special Interest Group on Information Retrieval)
conferences. As natural language processing technology matures, it is increasingly being used to
support other computer applications. Such use naturally falls into two areas: one in which
linguistic analysis merely serves as an interface to the primary program, and a second in which
natural language considerations are central to the application. A natural language interface
translates a user's request into a formal database query, and the program then proceeds as it
would without the use of natural language processing techniques. The design of question
answering systems is similar to that of interfaces to database management systems; one
difference, however, is that the knowledge base supporting the question answering system does
not have the structure of a database. Similarly, in message understanding systems a fairly
complete linguistic analysis may be required, but the messages are relatively short and the
domain is often limited. Further application areas include information filtering and text
categorization. In both applications, natural language processing imposes a linguistic
representation on each document being considered. In text categorization, a collection of
documents is inspected and all documents are grouped into several categories based on the
characteristics of their linguistic representations. In information filtering, documents satisfying
some criterion are singled out from a collection.
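A minimal sketch of the text categorization idea just described: each document is assigned the category whose keyword list it overlaps most. The categories and keyword sets here are invented for illustration; real systems learn such associations statistically rather than from hand-written lists.

```python
# Keyword-overlap text categorization. Categories and their keyword
# sets are illustrative assumptions, not a real taxonomy.

CATEGORIES = {
    "sports": {"match", "team", "goal", "score"},
    "finance": {"stock", "market", "shares", "profit"},
}

def categorize(document):
    words = set(document.lower().split())
    # Pick the category with the largest keyword overlap; "other" if none match.
    scores = {cat: len(kw & words) for cat, kw in CATEGORIES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "other"

docs = [
    "The team scored a late goal to win the match",
    "Shares rallied as the stock market hit a record",
]
print([categorize(d) for d in docs])  # -> ['sports', 'finance']
```

Information filtering is the complementary operation: keeping only the documents whose score for a criterion category is positive.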

Reason for choosing this research project

Almost every business today is looking to embrace AI and reap the advantages of its subsets with
an intelligence-driven system that captures, processes and synthesizes data, resulting in
automated data analysis as well as content management. Despite the tremendous success and
adoption of Big Data, research shows that only 20% of employees with access to business
intelligence tools have the literacy or domain expertise to utilize them. On the other hand, data
presented through charts and graphs is not always easy to read, often leading to
misinterpretation and poor decision making. This is where the subset of AI technologies
(Natural Language Processing, Natural Language Understanding and Natural Language
Generation) and their analytical algorithms come into the picture: they can automate such
tasks, helping to increase productivity and save both time and money.

The reasons why I chose AI in Natural Language Generation go beyond the usual perception that
people have when it comes to AI adoption. Some of these reasons are given below.

1. Automated Content Creation

What NLG is mainly capable of is creating an organized structure of data from the
information processed in the previous stages of NLP and NLU. By placing this well-structured
data in a carefully configured template, NLG can automate the output and supply documentable
forms of data such as analytics reports, product descriptions, data-centric blog posts, etc. In such
cases, algorithmically programmed machines are at complete liberty to create content in the
format desired by content developers. The only thing left for them to do is to promote it to the
target audience via popular media channels. Thus, Natural Language Generation fulfils two
purposes for content developers and marketers:

1. Automation of content generation
2. Data delivery in the expected format

Content generation revolves around web mining and relies on search engine APIs to develop
effective content using various online search results and references.

So far, several NLG-based text report generation systems have been built to produce textual
weather forecast reports from input weather data.

Additionally, a firm aiming to generate accurate weather forecast reports will be able to
translate the statistical structure of weather forecast data into an organized, reader-friendly
textual format using the real-time analytical power of Natural Language Generation.
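The weather-report use case above is the classic example of template-based NLG: structured data is rendered into a reader-friendly sentence. The field names and template wording below are assumptions for illustration; production NLG systems choose among many templates and vary phrasing.

```python
# Template-based Natural Language Generation: render structured
# weather data as a readable forecast sentence. Field names are
# illustrative assumptions.

def generate_forecast(data):
    template = (
        "{city}: expect {condition} skies with a high of {high}°C "
        "and a low of {low}°C. Chance of rain: {rain_pct}%."
    )
    return template.format(**data)

report = generate_forecast({
    "city": "Kathmandu",
    "condition": "partly cloudy",
    "high": 24,
    "low": 12,
    "rain_pct": 30,
})
print(report)
```

The same pattern (structured input, configured template, textual output) underlies the analytics reports and product descriptions mentioned earlier.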

2. Significant Reduction in Human Involvement

With Natural Language Generation in place, it becomes unnecessary to hire data-literate
professionals and train them for the job. So far, as corporate theories go, a human workforce is
key to understanding consumers' interests and needs and converting them into written stories.

However, with Natural Language Generation, machines are programmed to scrutinize what
customers want, identify important business-relevant insights and prepare summaries around
them.

The value of NLG is doubled after realizing how expensive and ineffective it is to employ people
who spend hours understanding complex data. Gartner even predicted that, by 2018, 20% of
business content would be authored by machines using Natural Language Generation and
integrated into major smart data discovery platforms. Legal documents, shareholder reports,
press releases and case studies will no longer require humans to create them.
3. Predictive Inventory Management
Successful inventory management gives any store a great boost in terms of business goals and
overall profit, given that certain products have very high margins. Data matters most and plays a
key role in areas such as supply chain, production rate and sales analytics. Based on this
information, store managers can make decisions about maintaining inventory at its optimal
levels. However, it is not reliable to always expect managers to be sound with data and interpret
it efficiently.

Advanced NLG can work as an interactive medium for data analysis and makes the overall
reporting process seamless and insightful. Instead of having to go through several charts and bar
graphs, store managers get clear narratives and analysis in the desired format, telling them
whether or not they will require a specific item next week. With natural language generation,
managers have the best predictive model, with clear guidance and recommendations on store
performance and inventory management.

4. Performance Activity Management at Call Centre

It is prudent to conduct performance reviews and accurate training for further improvements
within a call centre. However, as covered in the use cases above, charts won't help much in
communicating the exact pain points and areas of improvement unless they are accompanied by
strong narrative feedback. This is where the advantage of Natural Language Generation
combined with NLP lies.

NLG can be strategically integrated in major call centre processes with in-depth analysis of call
records and performance activities to generate personalized training reports. It can clearly state
just how call centre employees are doing, their progress and where to improve in order to reach a
target milestone.

Literature Review of AI in Natural Language Processing: Research work in natural language
processing has been increasingly addressed in recent years. Natural language processing is the
computerized approach to analyzing text and is a very active area of research and development.
The literature distinguishes the main applications of natural language processing and the
methods used to describe it. NLP for speech synthesis is based on text-to-speech (TTS)
conversion, in which text data is the input to the system. It uses high-level modules for speech
synthesis, including sentence segmentation, which deals with punctuation marks using a simple
decision tree. In NLP for speech recognition, automatic speech recognition systems make use of
NLP techniques based on grammars. They use context-free grammars to represent the syntax of
the language, and deal with spontaneous speech through the addition of automatic
summarization, including indexing, which extracts the gist of the speech transcriptions in order
to handle information retrieval and dialogue system issues.
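The sentence-segmentation step mentioned above can be sketched in a few lines. This is a naive rule-based version: it splits after sentence-final punctuation while protecting a few common abbreviations. The abbreviation list is an illustrative assumption; real TTS front-ends use much richer rules or, as the text notes, a trained decision tree.

```python
import re

# Naive sentence segmentation of the kind used as a TTS front-end step:
# split on ., ! or ? followed by whitespace, but keep going when the
# preceding token is a known abbreviation. The abbreviation set is a
# small illustrative assumption.

ABBREVIATIONS = {"dr.", "mr.", "mrs.", "e.g.", "i.e."}

def split_sentences(text):
    # Tentative split after every sentence-final punctuation mark.
    parts = re.split(r"(?<=[.!?])\s+", text.strip())
    sentences, buffer = [], ""
    for part in parts:
        buffer = f"{buffer} {part}".strip() if buffer else part
        last_word = buffer.split()[-1].lower()
        if last_word not in ABBREVIATIONS:
            sentences.append(buffer)
            buffer = ""
    if buffer:
        sentences.append(buffer)
    return sentences

print(split_sentences("Dr. Smith arrived late. The talk, e.g. the demo, went well!"))
# -> ['Dr. Smith arrived late.', 'The talk, e.g. the demo, went well!']
```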

A paper by Schütze et al. (1998), “Foundations of Natural Language Processing”, focuses on the
following areas of NLP:

 Text mining, natural language processing and information extraction
 Text information systems and information retrieval
 Text categorization methods
 Mining Web linkage structures

The paper puts forward an important idea about context: “Humans rely on context to interpret”
(when possible), and this context may extend beyond a given document. Further, word-level
ambiguity can be studied with a better focus. Word-level ambiguity, syntactic ambiguity,
anaphora resolution and presupposition methods are discussed. The following observations are
noted in the paper.

1. Text databases are also called document databases; they can contain large collections of
documents from various sources: news articles, research papers, books, digital libraries,
e-mail messages, Web pages, and library databases for today's knowledge mining
requirements.
2. The data stored is usually semi-structured.
3. The field developed in parallel with database systems. Information is organized into (a large
number of) documents, and information retrieval activity concentrates on locating
relevant documents based on user input, such as keywords or example documents, in the
knowledge source.
4. Typical modern IR systems include online library catalogs, online document
management systems and web-related knowledge sources.
5. Information retrieval vs. database systems: some database problems are not present in IR
(e.g., updates, transaction management, complex objects), while some IR problems are not
addressed well in a DBMS (e.g., unstructured documents, approximate search using
keywords, and relevance), and these can be considered difficult problems.
6. Shallow linguistic approaches concentrate on following points.
 English Lexicon
 Part-of-Speech Tagging
 Word Sense Disambiguation
 Phrase Detection / Parsing
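Word sense disambiguation, one of the shallow approaches listed above, can be illustrated with a toy version of the Lesk idea: choose the sense whose dictionary gloss shares the most words with the surrounding context. The word, senses, and glosses below are hypothetical examples, not entries from a real lexicon:

```python
# Toy sense inventory: each sense of a word maps to the set of words in its gloss.
SENSES = {
    "bank": {
        "finance": set("an institution that accepts deposits and lends money".split()),
        "river": set("sloping land beside a body of water".split()),
    },
}

def disambiguate(word, context_sentence):
    """Return the sense of `word` whose gloss overlaps most with the context words."""
    context = set(context_sentence.lower().split())
    glosses = SENSES[word]
    return max(glosses, key=lambda sense: len(glosses[sense] & context))
```

For example, `disambiguate("bank", "she walked along the land beside the water")` picks the "river" sense, because its gloss shares more words with the context than the "finance" gloss does.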

Activities and timescales of the research project:


The diagram above provides the timescale of the natural language processing AI research
project, dividing the complete set of phases by the time allowed for completion. Twenty-five
weeks are provided for the development phase, because this phase of the research project
implements the data resources and analyses the applied methods; the research project depends
heavily on development, so an exact time allocation must be provided for it. Two weeks are
provided for data gathering, since the qualitative and quantitative research methodologies
provide data resources in different formats, theoretical and numerical, and both data
resources are used in the research project. Two weeks, four weeks, and one week are assigned
to regular discussions with the tutor, because the research project proceeds through the
tutor's suggestions and guidance. Finally, two weeks are provided for preparing the final
report of the research project, which presents the individual results and outcomes of the
research phases.

Research approach and methodology: In order to conduct this research, the researcher adopted
a qualitative research method. The qualitative method is primarily exploratory research, used
to gain an understanding of the reasons, perspectives, and opinions needed to solve the research
problem. Since the objective of the research is to include the perspective of professionals
on the impact of AI on natural language processing, qualitative research is the best
choice. For data collection, the research includes primary as well as secondary sources. The
researcher collected primary data, gathered for the first time to solve the research problem,
through the interview method. In addition, different articles, journals,
books, websites and blogs are included as secondary data sources.

Research approaches implemented in AI digital marketing:

 Primary Research

 Secondary Research

Under these approaches I have chosen two methodologies:

1) Qualitative Research Method

2) Quantitative Research Method


a) Primary Research: Primary research is a methodology used by researchers to gather data
directly, instead of depending on data collected from previously completed research. It is
carried out to investigate problems that require in-depth analysis. Organizations can conduct
primary research themselves or can employ a third party to conduct research on their behalf.
One major advantage of primary research is that it is “pinpointed”: the research is carried
out around only a particular problem, and all the focus is directed at obtaining related
solutions.

1) Qualitative Research Method: Qualitative research is a market research method that
focuses on obtaining data through open-ended and conversational communication. This method
is concerned not only with “what” people think but also “why” they think so. For example,
consider a convenience store looking to improve its business. A systematic observation
concludes that the number of men visiting the store is greater than the number of women. One
good method of determining why women are not visiting the store is to conduct in-depth
interviews of potential customers in that category.

2) Quantitative Research Method: Quantitative research is the scientific investigation of
phenomena by gathering quantifiable data and applying statistical, mathematical, or
computational techniques. It gathers information from existing and potential customers
using sampling methods, online surveys, questionnaires, etc., the results of which
can be presented numerically. After careful analysis of these numbers, the researcher can
predict the future of a product or service and make changes accordingly.

b) Secondary Research: (QuestionPro, 2019) Secondary research is a research method that
involves using already existing data. Existing data is summarized to increase the overall
effectiveness of research. It includes research material published in research reports and
similar documents. These documents can be made available by public libraries, websites, data
obtained from already completed surveys, etc. Government and non-government agencies also
store data that can be used for research purposes and retrieved from them. Secondary
research is much more cost-effective than primary research, as it makes use of already
existing data, unlike primary research, where data is collected first-hand by organizations
(or by a third party employed to collect data on their behalf).
 Comments and agreement for tutor -

Comment (optional) –

Agreed:…………….

Name:…………………

Date:………………….

Examine appropriate research methods and approaches to primary and secondary research.
Introduction: According to the scenario, I work as a trainee IT Security Specialist for a
leading IT company. My duty is to complete a computing research project based on the theme
“AI” (artificial intelligence). In this project I will present the complete computing research
project, engaging myself in a specific field. In this section, I will describe the appropriate
research methods and approaches to primary and secondary research, covering all the main
aspects of both.

Research method types:


1. Primary Research: Popular culture is rife with images of the solitary scientist locked up
in her laboratory, combining the contents of test tubes or prodding a lab rat through a
maze. Usually, such a scientist is engaged in one method of primary research called
experimentation, in which a researcher will set up a series of tests or demonstrations in
the controlled setting of a lab in order to test his or her hypothesis. What isn’t made
evident in popular culture is that scientists, scholars, and researchers can actually choose
to engage in a variety of different forms of primary research, depending on their field of
study and the kind of knowledge they want to discover. Other examples of primary
research methods include observation, interviews, focus groups or panels, surveys,
and ethnography. In this class, you will probably only conduct primary research using a
couple of these methods. But learning about each of them will give you a better
understanding of the kinds of research that scholars and experts might do. And since
scholars and researchers also write to report their primary research, it will also help you
better understand the studies, reports, and articles you find when you do secondary
research. (Courses.lumenlearning.com, 2020)

Types of Primary Research Methods


Here are some of the primary research methods organizations or businesses use to collect data:

Observation

This method involves going out in the world and watching, using your five senses to collect data.
This method was used in the first writing project for this class, where students examined the
contents and rhetorical features of a film documentary in order to come to conclusions about the
arguments made in it.

Here are other ways observation might be used:

 One might observe a group or organization, exploring how business is conducted or how
people in the group communicate

 One might observe artwork or other man-made or natural objects in order to interpret that
artwork.

 One might view and record observations from several people’s Facebook pages to
examine how this kind of social media commonly gets used.

 One might observe memorial spaces in public parks at various times in a day to record
how the public makes use of those spaces.

Observation is great for inquiry in which you either can’t ask questions (for instance, a
monument or painting won’t talk back) or want to collect information on how
something works without interfering by participating yourself or asking questions for which
you may or may not receive the best answers. At the same time, observation means you can only
observe one or a few examples, so it is hard to say that anything you observed is true for
most or all situations.

Quantitative observation usually involves tallying: simply making a mark every time the
phenomenon you are observing happens. This allows you to calculate the frequency or count of
anything being observed. To do this, you must select periods of time in which to collect data
and decide beforehand on a certain set of behaviors or phenomena you will count during each
observation period. After that, of course, you must observe and tally those behaviors or
phenomena. After you collect these numerical results, you can interpret the data and evaluate
it in terms of your research question(s). Whichever kind of observation you perform, it will
require you to make a plan deciding what kinds of things you will look for when you observe
(what kind of phenomena fits the bill for the research question you’re trying to answer). It
will also often require that you plan certain times and/or places in which to do your
observations. This is especially the case when you plan to observe things that happen at
different times of day and/or in various locations.
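The tallying and frequency calculation described above map directly onto code. The observation log below is hypothetical, one entry per time a behavior was seen during the observation period:

```python
from collections import Counter

# Hypothetical log of shopper behaviors recorded during one observation period.
observed = ["enter", "browse", "enter", "purchase", "enter", "browse"]

tally = Counter(observed)  # raw count per behavior
total = sum(tally.values())
# Relative frequency of each behavior across the whole period.
frequency = {behavior: count / total for behavior, count in tally.items()}
```

Here `tally["enter"]` is 3 and `frequency["purchase"]` is 1/6; these are the numerical results you would then interpret against your research question(s).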

When doing observational research, it can often be useful to record what you are observing,
either photographing or video-recording it.  This is useful because it allows you to look at it
again and again. Keep in mind, though, that if a researcher records people in a way that would
make them identifiable by others, they must gain permission to use those images or footage from
the individuals recorded. (Courses.lumenlearning.com, 2020)

Focus groups:

This popular research technique is used to collect data from a small group of people, usually
restricted to 6-10. A focus group brings together people who are experts in the subject matter
for which research is being conducted. A focus group has a moderator who stimulates discussion
among the members to get greater insights. Organizations and businesses can use this method
especially to identify a niche market or to learn about a specific group of consumers.
(Courses.lumenlearning.com, 2020)

Interview

Interviews involve one-on-one sessions with individuals, in which you ask open-ended
questions. You collect their broad, open-ended answers much as you do with observation,
without coming to conclusions or assumptions about what the person is saying. Only afterward
do you analyze the answers and relate them to the subject and your research question.
Interviews come in a couple of different varieties. One version is the representative
interview, in which you interview people who are affected by or experience a certain problem
or issue. Another version is the expert interview, in which you interview people who are
experts, scholars, professors, or professionals in a field related to your topic of research.
It is even possible to combine interviews with observation, by asking interview respondents
to view something (like a video or a set of images) and then asking questions about what they
think or noticed about the phenomena or artifacts they viewed.

Interviews are quintessentially qualitative, leading to complex understandings of the
viewpoints of one person or a small group of people. Generally, the answers are in-depth and
nuanced, because the respondent has some time to construct his or her answers carefully and
add clarification if needed. Another good thing about interviews is that they allow you to
develop specific questions tailored to the individuals you are interviewing and to change
your questions or come up with new questions based on the respondent’s previous answers. In
this way, the interview becomes a sort of conversation; the information you collect adjusts
and changes according to what you discover at that moment. On the other hand, interviewing
limits the number of people from whom you can collect information, so it isn’t as good for
coming to conclusions about what most or all people think. (Courses.lumenlearning.com, 2020)

Surveys

Surveys involve developing a series of short, easy-to-answer, multiple-choice or
multiple-answer questions that are distributed to a large number of people. Usually, surveys
are used to collect quantitative data; a researcher will total up each kind of answer for each
question and calculate the mean (average), median (middle), and mode (most common) of those
answers. As well, other statistical analysis can be done on survey data to mathematically
determine how significant or remarkable certain answers are. In any case, the numerical data
collected from a survey is then interpreted, looking for the answers it provides to the
research question(s).
Surveys are great for collecting information about large groups of people, since you can
distribute surveys widely, collect them as a group, quickly total up answers, and do calculations. 
Because of this, you can begin to make conclusions based on how representative your survey
sample is of the larger group you are investigating.  A good survey sample means that you can
assume that even people you didn’t survey will likely answer in the same way as those you did
survey. On the other hand, you cannot collect very complex information through a survey since
the people who take the survey are automatically limited in the kinds of answers they can give
and the questions and answers have to remain general enough to refer to and be understandable
by all people. (Courses.lumenlearning.com, 2020)
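The mean, median, and mode calculations described above can be done directly with Python's standard statistics module. The 1-to-5 rating responses below are made up for illustration:

```python
import statistics

# Hypothetical answers to one survey question on a 1-5 scale.
responses = [4, 5, 3, 4, 5, 4, 2, 5, 4, 4]

mean = statistics.mean(responses)      # the average answer
median = statistics.median(responses)  # the middle answer when sorted
mode = statistics.mode(responses)      # the most common answer
```

For this sample all three measures come out to 4, suggesting answers cluster around that rating; real survey data would usually show more spread between the three.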

Primary Research Advantages and Disadvantages

The diagram below explains the advantages and disadvantages of each type of primary research
that we have discussed in this article.
Secondary research method: By far the most widely used method of collecting data is
secondary data collection, commonly called secondary research. This process involves
collecting data from either the originator or a distributor of primary research; in other
words, accessing information already gathered. In most cases this means finding information
from third-party sources, such as industry research reports, company websites, magazine
articles, and other sources. But in actuality, any information previously gathered, whether
from sources external to the marketer or from internal sources, such as material from
previous market research carried out by the marketer’s organization, old sales reports,
accounting records and many others, falls under the heading of secondary research.
(Courses.lumenlearning.com, 2020)
Types of Secondary Research Methods

 Public sources: These are usually free, often offer a lot of good information, and include
government departments, business departments of public libraries, and so on.
 Commercial sources: These are valuable, but usually involve cost factors such as
subscription and association fees. Commercial sources include research and trade
associations, such as Dun & Bradstreet and Robert Morris & Associates, banks and other
financial institutions, and publicly traded corporations. (Courses.lumenlearning.com,
2020)
 Educational institutions: These are frequently overlooked as valuable information
sources, even though more research is conducted in colleges, universities, and technical
institutes than in virtually any other sector of the business community.

Advantages and disadvantages of secondary research:

Advantages:

 Already gathered, so it may be quicker to collect.
 May be gathered on a much larger scale than is possible for the firm.
 In some cases it can be very cheap or free to access.

Disadvantages:

 Information may be outdated, and therefore inaccurate.
 The data may be biased, and it is hard to know whether the information collected is accurate.
 The data was not gathered for the specific purpose the firm needs, or is not relevant to the original context.
 In some cases it can be costly, e.g. marketing firm reports.

Pros And Cons Of Primary Vs Secondary Research


Every market research method, whether it is categorized as a primary or secondary method, has
positive aspects and drawbacks. Generally speaking, secondary research is where most
researchers should begin when opening a new research project. Whether primary research is
necessary or not, secondary research is a valuable step in the market research process. Secondary
research is worthwhile because it is generally more cost-effective than primary research and it
provides a foundation for any project. Evaluating the current landscape of available information
before moving on to primary research methods can save time and money that may be better spent
elsewhere. (QuestionPro, 2019)

The main limitations of secondary research are associated with chance. Depending on the
research questions, there may or may not be information available that provides concrete
answers. If there is not enough information from past studies, it may be necessary to funnel time
and money into primary methods of research. Subsequent primary research, when necessary,
should be planned out carefully in advance. The purpose of primary research is to answer
specific questions that accomplish a project’s research goals. The specific nature of answering
questions tailored to individual needs is one reason why primary research is valuable. Timeline
and budget restrictions may be limiting factors for primary research, but planning ahead is
worthwhile for the valuable information that this method can provide.

Difference between primary and secondary research:

Basis for comparison of primary research versus secondary research:

 Meaning: Research conducted to gather first-hand information for the current problem is
called primary research; secondary research involves the use of information gathered
originally by primary research.
 Based on: raw data (primary) versus analysed and interpreted information (secondary).
 Carried on by: the researcher himself (primary) versus someone else (secondary).
 Data: specific to the needs of the researcher (primary) versus may or may not be specific
to those needs (secondary).
 Process: very involved (primary) versus rapid and easy (secondary).
 Cost: high (primary) versus low (secondary).
 Time: long (primary) versus short (secondary).

You can also find out the difference between primary and secondary research, in the following
points in detail:

1. Research conducted to gather first-hand information, for the current problem is called
Primary Research. Secondary Research is one that involves the use of information
obtained originally by primary research.

2. Primary Research is based on raw data, whereas secondary research is based on analysed
and interpreted information.

3. In primary research, the data is collected by the researcher himself or by a person
hired by him. As against this, in secondary research, the data collection is performed by
someone else.

4. The primary research process is very involved and deeply explores the topic.
Conversely, the secondary research process is fast and easy, and aims at gaining a broad
understanding of the subject.

5. In primary research, as the researcher conducts the research, the data collected is
always specific to the needs of the researcher. This is opposed to secondary research,
wherein the data lacks particularity, i.e. it may or may not be as per the requirements of
the researcher.

6. Primary research is an expensive process, wherein high cost is involved in the
exploration of data and facts from various sources. Secondary research, in contrast, is an
economical process, wherein low cost is involved in acquiring pertinent information
because the data has already been collected by someone else.

7. Primary research consumes a lot of time, as the research is done from scratch. In the
case of secondary research, since the collection of data has already been done, the
research takes comparatively less time.

Conclusion: Finally, I have described the appropriate research methods and approaches to
primary and secondary research, briefly introducing their definitions, examples and
procedures, along with their advantages and disadvantages. I will further evaluate research
methodologies and processes as applied to a computing research project, in order to justify
the chosen research methods and analysis. The emergence of artificial intelligence (AI) has
changed the dynamics of the business world. One of the significant applications of AI is in
the marketing field, where it helps improve performance. The present research aims to find
out the impact of AI in marketing by including the perspective of marketing professionals in
Nepal.
LO2: Conduct and analyze research relevant for a computing research project.
Introduction: After finishing LO1, in this task I am going to conduct primary and secondary
research using appropriate methods for a computing research project, considering costs,
access and ethical issues, and applying appropriate analytical tools.
Conduct primary and secondary research using appropriate methods for a computing
research project that consider costs, access and ethical issues.
Conducting Secondary Research
Conducting secondary research is similar to the research that students conduct throughout
primary school. Answers to research questions are already available online, in academic
databases, the news, published books, journals, etc.—the work is in wading through the
information that is already available and finding data that coincides with the particular research
project. The volume of information available on a particular topic may be overwhelming at the
beginning of the secondary research process. Research questions should be used to guide the
researcher as they focus on finding project-specific information. The best source to answer a
particular research question may vary widely, and a single project will likely require more than
one source. (Anon, 2020)

Conducting Primary Research


The purpose of primary research is to gather information and answer questions that have not
been asked before. Primary research is typically more time consuming and has higher associated
costs, so it is in the best interest of an organization to only conduct primary research after the
gaps in available secondary research have been identified. Primary research should be conducted
only after comprehensive secondary research is completed. This is important to note because
primary research uses more resources than secondary research. In primary research, the research
team is in charge of everything from choosing the best method to reach a desired audience, to
what specific metrics should be measured. Conducting secondary research beforehand is
necessary to determine what information is not already available so time and money is not
wasted on redundant primary research. (Anon, 2020)
Steps for choosing sources and conducting the most appropriate method:
The following steps outline a simple and effective strategy for writing a research paper.
Depending on your familiarity with the topic and the challenges you encounter along the way,
you may need to rearrange these steps. (Anon, 2020)

Step 1: Identify and develop a research topic

Selecting a topic can be the most challenging part of a research assignment. Since this is the very
first step in writing a paper, it is vital that it be done correctly. Here are some tips for selecting a
topic: (Anon, 2020)

1) Select a topic within the parameters set by the assignment. Many times your instructor
will give you clear guidelines as to what you can and cannot write about. Failure to work
within these guidelines may result in your proposed paper being deemed unacceptable by
your instructor.
2) Select a topic of personal interest to you and learn more about it. The research for and
writing of a paper will be more enjoyable if you are writing about something that you
find interesting.
3) Select a topic for which you can find a manageable amount of information. Do a
preliminary search of information sources to determine whether existing sources will
meet your needs. If you find too much information, you may need to narrow your topic; if
you find too little, you may need to broaden your topic.
4) Be original. Your instructor reads hundreds of research papers every year, and many of
them are on the same topics (topics in the news at the time, controversial issues, subjects
for which there is ample and easily accessed information). Stand out from your
classmates by selecting an interesting and off-the-beaten-path topic.
5) Still can't come up with a topic to write about? See your instructor for advice.

Step 2: Do a preliminary search for information


Before beginning your research in earnest, do a preliminary search to determine whether there is
enough information out there for your needs and to set the context of your research. Look up
your keywords in the appropriate titles in the library's Reference collection (such as
encyclopedias and dictionaries) and in other sources such as our catalog of books, periodical
databases, and Internet search engines. Additional background information may be found in your
lecture notes, textbooks, and reserve readings. You may find it necessary to adjust the focus of
your topic in light of the resources available to you. (Anon, 2020)

Step 3: Locate materials

With the direction of your research now clear to you, you can begin locating material on your
topic. There are a number of places you can look for information. If you are looking for
books, do a subject search in the Aleph catalog. A keyword search can be performed if the
subject search doesn't yield enough information. Print or write down the citation information
(author, title, etc.) and the location (call number and collection) of the item(s). Note the
circulation status. When you locate the book on the shelf, look at the books located nearby;
similar items are always shelved in the same area. The Aleph catalog also indexes the
library's audio-visual holdings.

Use the library's electronic periodical databases to find magazine and newspaper articles.
Choose the databases and formats best suited to your particular topic; ask the librarian at
the Reference Desk if you need help figuring out which database best meets your needs. Many
of the articles in the databases are available in full-text format. (Anon, 2020)

Step 4: Evaluate sources

See the CARS Checklist for Information Quality for tips on evaluating the authority and quality
of the information you have located. Your instructor expects that you will provide credible,
truthful, and reliable information and you have every right to expect that the sources you use are
providing the same. This step is especially important when using Internet resources, many of
which are regarded as less than reliable.

Step 5: Make notes


Consult the resources you have chosen and note the information that will be useful in your
paper. Be sure to document all the sources you consult, even if there is a chance you may not
use that particular source. The author, title, publisher, URL, and other information will be
needed later when creating a bibliography. (Anon, 2020)

Step 6: Write paper

Begin by organizing the information you have collected. The next step is the rough draft,
wherein you get your ideas on paper in an unfinished fashion. This step will help you organize
your ideas and determine the form your final paper will take. After this, you will revise the draft
as many times as you think necessary to create a final product to turn in to your instructor.

Step 7: Cite sources properly

Citing or documenting the sources used in your research serves two purposes: it gives proper
credit to the authors of the materials used, and it allows those who are reading your work to
duplicate your research and locate the sources that you have listed as references. The MLA and
the APA Styles are two popular citation formats. (Anon, 2020)
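As a tiny illustration of keeping references consistent, here is a hypothetical helper that assembles a book reference in an APA-like shape. It is a simplified sketch for this report, not an implementation of the full APA specification:

```python
def format_reference(author, year, title, publisher):
    """Assemble a book reference in a simplified APA-like style:
    Author (Year). Title. Publisher."""
    return f"{author} ({year}). {title}. {publisher}."
```

For example, `format_reference("Smith, J.", 2020, "Research Methods", "Example Press")` yields "Smith, J. (2020). Research Methods. Example Press.", so every entry in the bibliography follows the same pattern.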

Step 8: Proofread

The final step in the process is to proofread the paper you have created. Read through the text
and check for any errors in spelling, grammar, and punctuation. Make sure the sources you used
are cited properly. Make sure the message that you want to get across to the reader has been
thoroughly stated. (Anon, 2020)

Cost for conducting research: Cost estimating is the practice of forecasting the cost of
completing a project with a defined scope. It is the primary element of project cost management,
a knowledge area that involves planning, monitoring, and controlling a project’s monetary costs.
(Project cost management has been practiced since the 1950s.) The approximate total project
cost, called the cost estimate, is used to authorize a project’s budget and manage its costs.
Professional estimators use defined techniques to create cost estimates that are used to assess the
financial feasibility of projects, to budget for project costs, and to monitor project spending. An
accurate cost estimate is critical for deciding whether to take on a project, for determining a
project’s eventual scope, and for ensuring that projects remain financially feasible and avoid cost
overruns. Cost estimates are typically revised and updated as the project’s scope becomes more
precise and as project risks are realized — as the Project Management Body of
Knowledge (PMBOK) notes, cost estimating is an iterative process. A cost estimate may also be
used to prepare a project cost baseline, which is the milestone-based point of comparison for
assessing a project’s actual cost performance. (Smartsheet, 2020)

 Direct costs are broadly classified as those directly associated with a single area (such as
a department or a project). In project management, direct costs are expenses billed
exclusively to a specific project. They can include project team wages, the costs of
resources to produce physical products, fuel for equipment, and money spent to address
any project-specific risks.
 Indirect costs, on the other hand, cannot be associated with a specific cost center and are
instead incurred by a number of projects simultaneously, sometimes in varying amounts.
In project management, quality control, security costs, and utilities are usually classified
as indirect costs since they are shared across a number of projects and are not directly
billable to any one project.

A cost estimate is more than a simple list of costs, however: it also outlines the assumptions
underlying each cost. These assumptions (along with estimates of cost accuracy) are compiled
into a report called the basis of estimate, which also details cost exclusions and inclusions. The
basis of estimate report allows project stakeholders to interpret project costs and to understand
how and where actual costs might differ from approximated costs.

Beyond the broad classifications of direct and indirect costs, project expenses fall into more
specific categories. Common types of expenses include:

 Labor: The cost of human effort expended towards project objectives.


 Materials: The cost of resources needed to create products.
 Equipment: The cost of buying and maintaining equipment used in project work.
 Services: The cost of external work that a company seeks for any given project (vendors,
contractors, etc.).
 Software: Non-physical computer resources.
 Hardware: Physical computer resources.
 Facilities: The cost of renting or using specialized equipment, services, or locations.
 Contingency costs: Costs added to the project budget to address specific risks.
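A minimal sketch of how direct costs, an indirect overhead rate, and a contingency reserve combine into a total cost estimate follows. All figures and rates below are invented for illustration, not drawn from any real project:

```python
# Hypothetical direct costs billed exclusively to this project.
direct_costs = {"labor": 12000.0, "materials": 3000.0, "equipment": 1500.0}

INDIRECT_RATE = 0.15     # assumed share of direct costs allocated as shared overhead
CONTINGENCY_RATE = 0.10  # assumed reserve added to address project-specific risks

direct = sum(direct_costs.values())
indirect = direct * INDIRECT_RATE
contingency = (direct + indirect) * CONTINGENCY_RATE
total_estimate = direct + indirect + contingency
```

With these assumed figures the direct costs total 16,500, overhead adds 2,475, and the contingency reserve brings the estimate to 20,872.50; revising any input and re-running reflects the iterative nature of cost estimating noted above.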

Ways we Can Reduce Research Costs: There are many factors and interrelationships between
the factors that affect market research costs. Everyone’s budget is limited, so save your research
funds for the insights that are critical to your key business or organizational decisions. Also,
think about other ways to gain the insights you want. For example, if you want to determine what
promotion will have the greatest appeal to prospective customers, test some messaging in online
posts and ads and see which ones generate more web or social media traffic. Don’t use a focus
group or survey for the kinds of findings you can test relatively inexpensively out in the market
(as long as it won’t potentially cause harm to your brand). If you want to understand how your
staff is treating customers, write a quick three-question survey with Survey Monkey or another
free tool and send it to your customers right after they visit your business or meet with your sales
rep. On the other hand, if you want to determine what motivates customers to select you or
identify how you could increase your share of their purchasing, a more formal, comprehensive
research study would be appropriate. (Smartsheet, 2020)

Make the most of your research dollars with these tips:

 Be very clear and specific with your research goals and how you intend to use the results
– this will help focus the research and reduce unnecessary costs.

 Know who within your organization will use the research results so you can determine
the key information that will be most useful.

 Allow enough time in the process for research decisions to be made before research
begins, time for the research to be properly conducted and analyzed and time to process
the results before sharing with the decision-maker(s).
 Take your results and develop different presentations with the information of most interest to key functional teams (e.g., sales, marketing, customer service) or levels within your organization (e.g., department leaders, C-suite).

 Make sure everyone who will be using the results is able to assist with making the research decisions, even the wording of questions, so that the research includes information they will find usable, making it less likely they will disregard the results.

 Obtain customer and prospect emails and update them regularly – it will make data
collection easier and reduce costs if you are using outside research firms.

 Make sure customers who sign up for loyalty programs, text or email alerts give you
permission to contact them for research purposes in their confidentiality agreement sign-
off.

 As appropriate, use your social media pages, website, e-newsletters and all other communication vehicles and customer interactions to gather opinions and test ideas.

 Consider non-monetary items you can offer to research participants, e.g., a one-time discount, free consultation, loyalty program points, a copy of research results, or a product sample.

 Don’t over-survey your customers, as it will reduce future participation rates.

 If you conduct customer or employee research, make sure you communicate back to that
population what you learned (high-level) and how you are using those findings to make
improvements – this is the most important thing you can do to boost participation (and
reduce costs) in future research.

 Build a key customer group or panel that is interested in evaluating or co-creating products, services, ideas, etc. for the opportunity to influence (you might want to reward them with recognition or an annual gift).

 Make sure you are consolidating and analyzing the data you already collect – sales data, website analytics, loyalty program activity – to gain insights and guide additional research.

 Consistently collect and update your customer information, especially email and phone
number.

Major ethical issues in conducting research

Ranking: It is common to provide a ranking among entities with values that can be ordered. However, when there are a small number of entities, ranking may not be informative, especially if the size of the set is not also given. For example, if there is only one country reporting in a region for a particular indicator, an NLG engine could claim that the country is either the highest or lowest in the region. A region like North America, for which the World Bank lists Bermuda, Canada, and the United States, will sometimes only have data for two countries, as Bermuda is dramatically smaller, so clarity about which countries are being compared for a given indicator and timespan is essential.

Time series: Enterprise applications will usually contain Terms of Use stating that data may be incomplete and calculations may include missing points. However, users may still assume that content shown by an application is authoritative, leading to a wrong impression about the accuracy of the data. Table 3 shows the life expectancy for Curaçao from 2009-2015. Here we see that 2010, 2012, and 2013 are missing. NLG systems should check for missing values, and users should be informed if calculations are performed on data with missing values or if values presented to them have been imputed. (Aclweb.org, 2020)
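As a sketch of the missing-value check recommended here, the fragment below scans a year-indexed series for gaps before any statistic is computed. The life-expectancy figures are invented placeholders, not the World Bank values.

```python
# Hedged sketch: check a year-indexed series for gaps before computing
# an average, as the text recommends for NLG systems. Values are invented.

life_expectancy = {2009: 76.6, 2011: 77.0, 2014: 77.7, 2015: 77.8}

def missing_years(series, start, end):
    """Return the years in [start, end] with no reported value."""
    return [year for year in range(start, end + 1) if year not in series]

gaps = missing_years(life_expectancy, 2009, 2015)
print(gaps)  # [2010, 2012, 2013]

# Any derived statistic should disclose that it ignores the missing years.
average = sum(life_expectancy.values()) / len(life_expectancy)
```

A generated sentence would then carry a caveat such as "average over the 4 reported years" rather than presenting the figure as covering the whole span.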

Personalization: One of the advantages of NLG systems is the capability to produce text
customized to the profile of individual users. Instead of writing one text for all users, the NLG
system can incorporate the background and context of a user to increase the communication
effectiveness of the text. However, users are not always aware of personalization. Hence,
insights they may obtain from the text can be aligned with their profile and history, but may also
be missing alternative insights that are weighed down by the personalization algorithm. One way
to address this limitation is to make users aware of the use of personalization, similar to how
provenance can be addressed.
Fraud Prevention: In sensitive financial systems, in theory a rogue developer could introduce
fraudulent code that generates overly positive or negative-sounding sentiment for a company, for
their financial gain. A code audit can bring attempts to manipulate any code base to light, and
pair programming may make any attempts less likely. (Aclweb.org, 2020)

Accessibility: In addition to providing misleading texts, the accessibility of the texts generated
automatically is an additional way in which users may be put in a disadvantaged position by the
use of an NLP system. First, the readability of the generated text may not match the expectations
of the target users, limiting their understanding due to the use of specialized terminology, or
complex structure. (Aclweb.org, 2020)

Issues in conducting research:
Every project is different and unique – however projects that fail usually do so as a result of
similar types of problems. Finding examples of failed projects is not difficult, but making a fair
assessment of the issues that caused failure may not be quite as easy. Projects can be completed
on time and within budget and still fail – if a project does not deliver the expected results and
quality, it can hardly be judged as successful. We’re taking a look at the ten most common
problems that derail projects. (Aclweb.org, 2020)
 Poor Planning – includes not prioritizing effectively, not having a proper business plan, and not breaking down the development into phases.
 Lack of Leadership – if the Project Manager lacks the relevant business/management expertise, this will lead to poor decision making.
 People Problems – unresolved conflicts can have a detrimental effect on the project. A Project Manager needs expert communication skills to keep everybody on board and in agreement.
 Vague/Changing Requirements – it's essential that the project requirements are defined clearly and completely from the start. Change requests can cause the project to drift and miss deadlines.
 Lifecycle Problems – often caused by poor planning or changing requirements. Initial testing techniques should be rigorous in order to avoid repeated errors.
 Inefficient Communication Process – it's vital to keep everybody informed on the project status at all times. Lack of efficient communication will lead to errors and delays.
 Inadequate Funding – this issue is most likely to affect projects with changing requirements.
 Stakeholder Approval – effective stakeholder management is the ability to identify individuals affected by, or likely to affect, the successful outcome of the project. A skilled project manager will ensure a collaborative working environment where project phases can be analyzed and discussed by all stakeholders.
 Absence of a Schedule – no established schedule for tasks, operational activities and objectives.
 Missed Deadlines – delays in phases of the project leading to a missed deadline for the project.
Apply appropriate analytical tools to analyze research findings and data.
Introduction: There are two main types of tools and techniques for analyzing research findings: qualitative research tools and quantitative research tools. Between them, these two approaches can capture all the data a project needs, providing the quality and the quantity of data respectively. Data collection and analysis tools are defined as a series of charts, maps, and diagrams designed to collect, interpret, and present data for a wide range of applications and industries. Various programs and methodologies have been developed for use in nearly any industry, ranging from manufacturing and quality assurance to research groups and data collection companies. (Asq.org, 2019)
Data analysis: According to LeCompte and Schensul, research data analysis is a process used by researchers to reduce data to a story and interpret it to derive insights. The data analysis process helps in reducing a large chunk of data into smaller fragments that make sense. Three essential things take place during the data analysis process: the first is data organization; the second is data reduction, achieved through summarization and categorization, which helps in finding patterns and themes in the data for easy identification and linking; the third is the analysis itself, which researchers carry out in either a top-down or bottom-up fashion.

Marshall and Rossman, on the other hand, describe data analysis as a messy, ambiguous, and time-consuming, but creative and fascinating, process through which a mass of collected data is brought to order, structure and meaning. We can say that "data analysis and interpretation is a process representing the application of deductive and inductive logic to the research." (QuestionPro, 2020)

Why analyze data in research?

Researchers rely heavily on data as they have a story to tell or problems to solve. It starts with a
question, and data is nothing but an answer to that question. But, what if there is no question to
ask? Well! It is possible to explore data even without a problem – we call it ‘Data Mining’ which
often reveal some interesting patterns within the data that are worth exploring.

Regardless of the type of data they explore, researchers are guided by their mission and their audience's vision to find the patterns that shape the story they want to tell. One of the essential things expected from researchers while analyzing data is to stay open and remain unbiased towards unexpected patterns, expressions, and results. Remember, sometimes data analysis tells the most unforeseen yet exciting stories that were not at all expected when the analysis began. Therefore, rely on the data you have at hand and enjoy the journey of exploratory data analysis in research. (QuestionPro, 2020)

Types of data in research

Every kind of data describes things by assigning specific values to them. For analysis, these values need to be organized, processed and presented in a given context to make them useful. Data can come in different forms; here are the primary data types. (QuestionPro, 2020)

 Qualitative data: When the data presented consists of words and descriptions, we call it qualitative data. Although you can observe this data, it is subjective and therefore harder to analyze in research, especially for comparison.

Example: qualitative data includes anything describing taste, experience, texture, or an opinion. This type of data is usually collected through focus groups, personal interviews, or open-ended questions in surveys.
 Quantitative data: Any data expressed in numbers or numerical figures is called quantitative data. This type of data can be distinguished into categories and grouped, measured, calculated, or ranked.

Example: data such as age, rank, cost, length, weight, scores, etc. all comes under this type. You can present such data in graphical formats or charts, or you can apply statistical analysis methods to it. The Outcomes Measurement Systems (OMS) questionnaires in surveys are a significant source of numeric data.

Data analysis in qualitative research

Data analysis in qualitative research works a little differently than with numerical data, as qualitative data is made up of words, descriptions, images, objects, and sometimes symbols. Getting insight from such complicated information is an involved process, which is why it is typically used for exploratory research and data analysis. (QuestionPro, 2020)

Finding patterns in the qualitative data

Although there are several ways to find patterns in textual information, a word-based method is the most relied-upon and widely used technique for research and data analysis. Notably, the data analysis process in qualitative research is manual: the researchers usually read the available data and look for repetitive or commonly used words. (QuestionPro, 2020)

 For example, while studying data collected from African countries to understand the most pressing issues faced by people, researchers might find "food" and "hunger" are the most commonly used words and will highlight them for further analysis. Keyword-in-context is another widely used word-based technique, in which the researcher tries to understand a concept by analyzing the context in which participants use a particular keyword.
 Similarly, researchers conducting research and data analysis to study the concept of 'diabetes' amongst respondents might analyze the context of when and how each respondent used or referred to the word 'diabetes.' The scrutiny-based technique is another highly recommended text analysis method for identifying patterns in qualitative data. Compare and contrast is the most widely used method under this technique, used to establish how specific texts are similar to or different from each other.
 For instance, to find out the "importance of a resident doctor in a company," the collected data is divided into people who think it is necessary to hire a resident doctor and those who think it is unnecessary. Compare and contrast is the best method for analyzing polls with single-answer question types. Metaphors can be used to reduce the data pile and find patterns in it so that it becomes easier to connect data with theory. Variable partitioning is another technique, used to split variables so that researchers can find more coherent descriptions and explanations in the enormous data.
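The word-based technique described above can be sketched in a few lines: count how often content words recur across open-ended responses. The responses and stopword list below are invented for illustration.

```python
from collections import Counter
import re

# Minimal sketch of the word-based technique: count repeated words in
# open-ended responses. The responses below are invented examples.

responses = [
    "Food prices keep rising and hunger is widespread",
    "Hunger affects children the most",
    "We need food aid and clean water",
]

# Common words that carry no analytical meaning are filtered out.
STOPWORDS = {"and", "the", "is", "we", "most", "keep"}

def word_frequencies(texts):
    words = []
    for text in texts:
        words += [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]
    return Counter(words)

freq = word_frequencies(responses)
print(freq.most_common(2))  # [('food', 2), ('hunger', 2)]
```

A researcher would then take the most frequent terms ("food", "hunger") as candidate themes and return to the raw responses to study the context in which they appear.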

Methods used for data analysis in qualitative research

There are several techniques to analyze the data in qualitative research, but here are some
commonly used methods, (QuestionPro, 2020)

 Content Analysis: This is widely accepted and the most frequently employed technique for data analysis in research methodology. It can be used to analyze documented information from text, images, and sometimes physical items. When and where to use this method depends on the research questions.
 Narrative Analysis: This is a method used to analyze content gathered from various
sources. Here, the source can be personal interviews, field observation, and surveys. The
majority of times, stories or opinions shared by people are focused on finding answers to
the research questions.
 Discourse Analysis: Similar to narrative analysis, discourse analysis is used to analyze
the interactions with people. Nevertheless, this particular method takes into consideration
the social context under which or within which the communication between the
researcher and respondent takes place. In addition to that, discourse analysis also focuses
on the lifestyle and day-to-day environment while deriving any conclusion.
 Grounded Theory: When you want to explain why a particular phenomenon happened,
then using grounded theory for analyzing quality data is the best resort. Grounded theory
is applied to study data about the host of similar cases occurring in different settings.
When researchers are using this method, they might alter explanations or produce new
ones until they arrive at some conclusion.

Data analysis in quantitative research


The first stage in research and data analysis is to prepare the data for analysis, so that nominal data can be converted into something meaningful. Data preparation consists of the following phases. (QuestionPro, 2020)

Phase I: Data Validation

Data validation is done to understand whether the collected data sample meets the pre-set standards or is a biased sample. It is divided into four stages:

 Fraud: To ensure an actual human being records each response to the survey or the
questionnaire
 Screening: To ensure each participant or respondent is selected or chosen in compliance
with the research criteria
 Procedure: To ensure ethical standards were maintained while collecting the data sample
 Completeness: To ensure that the respondent has answered all the questions in an online survey, or that the interviewer asked all the questions devised in the questionnaire.
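The completeness and screening stages can be sketched as simple checks over survey records. The field names and the adults-only criterion below are hypothetical examples, not a fixed standard.

```python
# Sketch of the completeness and screening checks described above.
# Field names and criteria are hypothetical.

REQUIRED_FIELDS = ["age", "gender", "q1", "q2"]

def is_complete(response):
    """Completeness: every required question has a non-empty answer."""
    return all(response.get(field) not in (None, "") for field in REQUIRED_FIELDS)

def passes_screening(response, min_age=18):
    """Screening: respondent meets the research criteria (here, adults only)."""
    return response.get("age", 0) >= min_age

sample = [
    {"age": 25, "gender": "F", "q1": "yes", "q2": "no"},
    {"age": 17, "gender": "M", "q1": "yes", "q2": "yes"},   # fails screening
    {"age": 30, "gender": "F", "q1": "", "q2": "maybe"},    # fails completeness
]

valid = [r for r in sample if is_complete(r) and passes_screening(r)]
print(len(valid))  # 1
```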

Phase II: Data Editing

More often than not, an extensive research data sample comes loaded with errors. Respondents sometimes fill in some fields incorrectly or skip them accidentally. Data editing is a process wherein researchers confirm that the provided data is free of such errors. For that, they need to conduct necessary checks and outlier checks to edit the raw data and make it ready for analysis.

Phase III: Data Coding

Out of all three phases, this is the most critical phase of data preparation; it is associated with grouping and assigning values to the survey responses. Suppose a survey is completed with a sample size of 1000; the researcher will then create age brackets to distinguish the respondents based on their age. This makes it easier to analyze small data buckets rather than deal with the massive data pile.
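A minimal sketch of this coding step assigns each respondent to an age bracket; the bracket boundaries below are illustrative, not a standard.

```python
# Sketch of the data-coding step: grouping respondents into age brackets
# so a large sample can be analyzed in smaller buckets. Brackets are
# illustrative, chosen only for this example.

def age_bracket(age):
    if age < 18:
        return "under 18"
    elif age < 30:
        return "18-29"
    elif age < 45:
        return "30-44"
    elif age < 60:
        return "45-59"
    return "60+"

ages = [22, 35, 61, 29, 45, 17]
coded = [age_bracket(a) for a in ages]
print(coded)  # ['18-29', '30-44', '60+', '18-29', '45-59', 'under 18']
```

Once coded this way, responses can be grouped per bracket and analyzed as small buckets instead of one undifferentiated pile.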

Methods used for data analysis in quantitative research

After the data is prepared for analysis, researchers are open to using different research and data
analysis methods to derive meaningful insights. For sure, statistical techniques are most favored
to analyze numerical data. The method is again classified into two groups. First, ‘Descriptive
Statistics’ used to describe data. Second, ‘Inferential statistics’ that helps in comparing the data.
(QuestionPro, 2020)

Descriptive statistics
This method is used to describe the basic features of the many types of data in research. It presents the data in such a meaningful way that patterns in the data start making sense. Nevertheless, descriptive analysis does not go beyond drawing conclusions about the data at hand; those conclusions are again based on the hypotheses researchers have formulated so far. Here are a few major types of descriptive analysis methods.

Measures of Frequency
 Count, Percent, Frequency
 Used to denote how often a particular event occurs
 Researchers use it when they want to show how often a response is given
Measures of Central Tendency
 Mean, Median, Mode
 The method is widely used to demonstrate distribution by various points
 Researchers use this method when they want to showcase the most commonly or averagely
indicated response
Measures of Dispersion or Variation
 Range, Variance, Standard deviation
 Range = difference between the highest and lowest scores
 Variance and standard deviation = measures of how far observed scores fall from the mean
 Used to identify the spread of scores by stating intervals
 Researchers use this method to show how spread out the data is, and how much that spread affects the mean
Measures of Position
 Percentile ranks, Quartile ranks
 It relies on standardized scores helping researchers to identify the relationship between
different scores.
 It is often used when researchers want to compare scores with the average count.
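The frequency, central-tendency, and dispersion measures listed above can be computed with Python's standard statistics module; the survey scores below are invented for illustration.

```python
import statistics

# Sketch of the descriptive measures listed above, computed with the
# standard library on an invented set of survey scores.

scores = [4, 8, 6, 5, 3, 8, 9, 5, 8]

mean = statistics.mean(scores)            # central tendency
median = statistics.median(scores)
mode = statistics.mode(scores)
data_range = max(scores) - min(scores)    # dispersion
stdev = statistics.stdev(scores)          # sample standard deviation

print(median, mode, data_range)  # 6 8 6
```

These figures describe only the sample at hand; generalizing beyond it is the job of the inferential methods discussed next in the text.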

Inferential statistics
Inferential statistics are used to make predictions about a larger population after research and data analysis of a sample representing that population. For example, at a movie theater you can ask some 100 audience members whether they like the movie they are watching. Researchers then use inferential statistics on the collected sample to reason that about 80-90% of the theater's audience likes the movie. (QuestionPro, 2020)

Here are two significant areas of inferential statistics

 Estimating parameters: it takes statistics from the sample research data and uses it to
demonstrate something about the population parameter.
 Hypothesis test: it’s about sampling research data to answer the survey research questions.
For example, researchers might be interested to understand if the new shade of lipstick
recently launched is good or not, or if the multivitamin capsules help children to perform
better at games.
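Estimating a parameter can be sketched with the movie-theater example above: from a sample of 100 viewers, infer a range for the share of the whole audience that likes the film. The figures and the normal-approximation interval below are illustrative assumptions, not a prescription.

```python
import math

# Sketch of estimating a population parameter from a sample, using the
# movie-theater example: assume 85 of 100 surveyed viewers liked the film.
# The numbers are illustrative.

n, liked = 100, 85
p_hat = liked / n                      # sample proportion (point estimate)

# 95% confidence interval via the normal approximation (z = 1.96)
margin = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
low, high = p_hat - margin, p_hat + margin
print(round(low, 3), round(high, 3))  # roughly 0.78 to 0.92
```

The interval, not the single sample figure, is what justifies a claim like "about 80-90% of the audience likes the movie."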

Here are some of the commonly used methods for data analysis in research
 Correlation: When researchers are not conducting experimental research wherein the
researchers are interested to understand the relationship between two or more variables,
they opt for correlational research methods.
 Cross-tabulation: Also called as contingency tables, cross-tabulation is a method used to
analyze the relationship between multiple variables. Suppose a provided data has age and
gender categories presented in rows and columns, then a two-dimensional cross-tabulation
helps for seamless data analysis and research by showing the number of males and the
number of females in each age category.
 Regression analysis: For understanding the strength of the relationship between two variables, researchers do not look beyond the primary and commonly used regression analysis method, which is also a type of predictive analysis. In this method, there is an essential factor called the dependent variable and one or more independent variables; in regression analysis, you undertake efforts to find out the impact of the independent variables on the dependent variable. The values of both independent and dependent variables are assumed to have been ascertained in an error-free random manner.
 Frequency tables: Frequency tables record how often each value (or range of values) of a variable occurs. They are a simple way to summarize the distribution of responses before applying further statistical tests.
 Analysis of variance: This statistical procedure is used for testing the degree to which two or more groups vary or differ in an experiment. A considerable degree of variation means the research findings were significant. In many contexts, ANOVA testing and variance analysis are similar.
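Two of these methods, cross-tabulation and correlation, can be sketched with the standard library; all records and figures below are invented examples.

```python
from collections import Counter
import math

# Cross-tabulation: count respondents in each (gender, age bracket) cell.
# Records are invented for illustration.
respondents = [
    ("male", "18-29"), ("female", "18-29"), ("female", "30-44"),
    ("male", "30-44"), ("female", "18-29"), ("male", "18-29"),
]
crosstab = Counter(respondents)
print(crosstab[("female", "18-29")])  # 2

# Pearson correlation between two numeric variables, computed directly.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

hours_studied = [1, 2, 3, 4, 5]          # hypothetical variables
test_scores = [52, 58, 61, 67, 72]
print(round(pearson(hours_studied, test_scores), 3))  # 0.996
```

A value near +1, as here, indicates a strong positive relationship; cross-tabulation, by contrast, summarizes how categorical variables co-occur.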
Considerations in research data analysis

 Researchers must have the necessary skills to analyze the data, and should be trained to demonstrate a high standard of research practice. Ideally, researchers should possess more than a basic understanding of the rationale for selecting one statistical method over another to obtain better data insights.
 Research and data analysis methods usually differ by scientific discipline; therefore, obtaining statistical advice at the beginning of the analysis helps in designing the survey questionnaire, selecting data collection methods, and selecting samples.
 The primary aim of research data analysis is to derive insights that are unbiased. Any mistake in, or bias when, collecting the data, selecting the analysis method, or choosing the audience sample is likely to result in a biased inference.
 No degree of sophistication in the analysis can rectify poorly defined objective outcome measurements. Whether the design is at fault or the intentions are unclear, lack of clarity can mislead readers, so avoid this practice.
 The motive behind data analysis in research is to present accurate and reliable data. As far as possible, avoid statistical errors, and find ways to deal with everyday challenges like outliers, missing data, data altering, data mining, and developing graphical representations.

Conclusion: The section above has covered conducting primary and secondary research using appropriate methods for a computing research project, considering costs, access and ethical issues, and applying appropriate analytical tools to analyze the findings and data.
LO3: Communicate the outcomes of a research project to identified stakeholders.
Introduction: In the task below, I first communicate the research outcomes in an appropriate manner for the intended audience. I then show how the research coherently and logically communicates outcomes to the intended audience, demonstrating how the outcomes meet the set research objectives.
Natural language processing: Natural language processing (NLP) describes the interaction
between human language and computers. It’s a technology that many people use daily and has
been around for years, but is often taken for granted.
A few examples of NLP that people use every day are:
 Spell check
 Autocomplete
 Voice text messaging
 Spam filters
 Related keywords on search engines
 Siri, Alexa, or Google Assistant
In any case, the computer is able to identify the appropriate word, phrase, or response by using
context clues, the same way that any human would. Conceptually, it’s a fairly straightforward
technology.
Where NLP outperforms humans is in the amount of language and data it’s able to process.
Therefore, its potential uses go beyond the examples above and make possible tasks that
would’ve otherwise taken employees months or years to accomplish. (Tableau Software, 2020)

Why Should Businesses Use Natural Language Processing?


Human interaction is the driving force of most businesses. Whether it’s a brick-and-mortar store
with inventory or a large SaaS brand with hundreds of employees, customers and companies
need to communicate before, during, and after a sale. That means that there are countless
opportunities for NLP to step in and improve how a company operates. This is especially true of
large businesses that want to keep track of, facilitate, and analyze thousands of customer
interactions in order to improve their product or service.
It would be nearly impossible for employees to log and interpret all that data on their own, but
technologies integrated with NLP can help do it all and more. (Tableau Software, 2020)
How Can Audiences Use Natural Language Processing?
There are a wide variety of different applications for NLP. Below are just three different ways
that companies can use the technology in their business. (Tableau Software, 2020)

1) Improve user experience

NLP can be integrated with a website to provide a more user-friendly experience. Features like
spell check, autocomplete, and autocorrect in search bars can make it easier for users to find the
information they’re looking for, which in turn keeps them from navigating away from your site.

2) Automate support

Chatbots are nothing new, but advancements in NLP have increased their usefulness to the point
that live agents no longer need to be the first point of communication for some customers. Some
features of chatbots include being able to help users navigate support articles and knowledge
bases, order products or services, and manage accounts.

3) Monitor and analyze feedback

Between social media, reviews, contact forms, support tickets, and other forms of
communication, customers are constantly leaving feedback about the product or service. NLP
can help aggregate and make sense of all that feedback, turning it into actionable insight that can
help improve the company.

Communicate research outcomes in an appropriate manner for the intended audience.

We don’t regularly think about the intricacies of our own languages. Language is an intuitive behavior used to convey information and meaning with semantic cues such as words, signs, or images. It’s been said that language is easier to learn and comes more naturally in adolescence because it’s a repeatable, trained behavior, much like walking. And language doesn’t follow a strict set of rules, with many exceptions like “I before E except after C.”

What comes naturally to humans, however, is exceedingly difficult for computers, given the amount of unstructured data, the lack of formal rules, and the absence of real-world context or intent. That’s why machine learning and artificial intelligence (AI) are gaining attention and momentum, with greater human dependency on computing systems to communicate and perform tasks. And as AI gets more sophisticated, so will Natural Language Processing (NLP). While the terms AI and NLP might conjure images of futuristic robots, there are already basic examples of NLP at work in our daily lives. Here are a few prominent examples. (Tableau Software, 2020)

The research outcomes are presented for the audience as follows:

Email filters

Email filters are one of the most basic and initial applications of NLP online. It started out with
spam filters, uncovering certain words or phrases that signal a spam message. But filtering has
upgraded, just like early adaptations of NLP. One of the more prevalent, newer applications of
NLP is found in Gmail's email classification. The system recognizes if emails belong in one of
three categories (primary, social, or promotions) based on their contents. For all Gmail users, this
keeps your inbox to a manageable size with important, relevant emails you wish to review and
respond to quickly. (Tableau Software, 2020)
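A naive keyword-based filter of the kind early spam filtering used can be sketched as follows. Real systems such as Gmail's classification use statistical models; the keyword list and threshold here are purely illustrative.

```python
# Naive sketch of keyword-based spam filtering, the earliest NLP
# application mentioned above. Real filters use statistical models;
# the keyword list here is purely illustrative.

SPAM_SIGNALS = {"winner", "free", "prize", "urgent", "claim"}

def looks_like_spam(subject, threshold=2):
    """Flag a subject line if it contains enough spam-signal words."""
    words = set(subject.lower().split())
    return len(words & SPAM_SIGNALS) >= threshold

print(looks_like_spam("URGENT claim your free prize"))  # True
print(looks_like_spam("Meeting notes for Tuesday"))     # False
```

Modern classifiers replace the fixed word list with probabilities learned from millions of labeled messages, which is what lets them sort mail into primary, social, and promotions categories.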

Smart assistants

Smart assistants like Apple’s Siri and Amazon’s Alexa recognize patterns in speech thanks to
voice recognition, then infer meaning and provide a useful response. We’ve become used to the
fact that we can say “Hey Siri,” ask a question, and she understands what we said and responds
with relevant answers based on context. And we’re getting used to seeing Siri or Alexa pop up
throughout our home and daily life as we have conversations with them through items like the
thermostat, light switches, car, and more. We now expect assistants like Alexa and Siri to
understand contextual clues as they improve our lives and make certain activities easier like
ordering items, and even appreciate when they respond humorously or answer questions about
themselves. Our interactions will grow more personal as these assistants get to know more about
us. As a New York Times article “Why We May Soon Be Living in Alexa’s World,” explained:
“Something bigger is afoot. Alexa has the best shot of becoming the third great consumer
computing platform of this decade.” (Tableau Software, 2020)
Search results

Search engines use NLP to surface relevant results based on similar search behaviors or user
intent so the average person finds what they need without being a search-term wizard. For
example, Google not only predicts what popular searches may apply to your query as you start
typing, but it looks at the whole picture and recognizes what you’re trying to say rather than the
exact search words. Someone could put a flight number in Google and get the flight status, type a
ticker symbol and receive stock information, or a calculator might come up when inputting a
math equation. These are some variations you may see when completing a search as NLP in
search associates the ambiguous query to a relative entity and provides useful results.

Predictive text

Things like autocorrect, autocomplete, and predictive text are so commonplace on our
smartphones that we take them for granted. Autocomplete and predictive text are similar to
search engines in that they predict things to say based on what you type, finishing the word or
suggesting a relevant one. And autocorrect will sometimes even change words so that the overall
message makes more sense. They also learn from you. Predictive text will customize itself to
your personal language quirks the longer you use it. This makes for fun experiments where
individuals will share entire sentences made up entirely of predictive text on their phones. The
results are surprisingly personal and enlightening; they’ve even been highlighted by several
media outlets. (Tableau Software, 2020)
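The learning behaviour of predictive text can be sketched with a tiny bigram model in Python. This is a deliberately simplified illustration; real keyboards use far richer models, but the idea of suggesting the word that most often followed the previous one is the same:

```python
from collections import Counter, defaultdict

class PredictiveText:
    """A minimal bigram model: suggests the next word based on what
    most often followed the previous word in text it has seen."""

    def __init__(self):
        self.following = defaultdict(Counter)

    def learn(self, text: str) -> None:
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.following[prev][nxt] += 1

    def suggest(self, prev_word: str, k: int = 3) -> list:
        counts = self.following[prev_word.lower()]
        return [w for w, _ in counts.most_common(k)]

model = PredictiveText()
model.learn("see you soon")
model.learn("see you tomorrow")
model.learn("see you soon then")
print(model.suggest("you"))  # "soon" was seen twice, "tomorrow" once
```

Because the model keeps learning from whatever you type, its suggestions drift toward your personal language quirks, which is exactly the effect described above.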

Language translation

One of the tell-tale signs of cheating on your Spanish homework is that grammatically, it’s a
mess. Many languages don’t allow for straight translation and have different orders for sentence
structure, which translation services used to overlook. But, they’ve come a long way. With NLP,
online translators can translate languages more accurately and present grammatically-correct
results. This is infinitely helpful when trying to communicate with someone in another language.
Not only that, but when translating from another language to your own, tools now recognize the
language based on inputted text and translate it. (Tableau Software, 2020)
Digital phone calls

We all hear “this call may be recorded for training purposes,” but rarely do we wonder what that
entails. Turns out, these recordings may be used for training purposes, if a customer is aggrieved,
but most of the time, they go into the database for an NLP system to learn from and improve in
the future. Automated systems direct customer calls to a service representative or online
chatbots, which respond to customer requests with helpful information. This is an NLP practice
that many companies, including large telecommunications providers, have put to use. NLP also
enables computer-generated language close to the voice of a human. Phone calls to schedule
appointments like an oil change or haircut can be automated. (Tableau Software, 2020)

Data analysis

Natural language capabilities are being integrated into data analysis workflows as more BI
vendors offer a natural language interface to data visualizations. One example is smarter visual
encodings, offering up the best visualization for the right task based on the semantics of the data.
This opens up more opportunities for people to explore their data using natural language
statements or question fragments made up of several keywords that can be interpreted and
assigned a meaning. Applying language to investigate data not only enhances the level of
accessibility, but lowers the barrier to analytics across organizations, beyond the expected
community of analysts and software developers. (Tableau Software, 2020)

Text analytics

Text analytics converts unstructured text data into meaningful data for analysis using different
linguistic, statistical, and machine learning techniques. While sentiment analysis sounds daunting
to brands, especially if they have a large customer base, a tool using NLP will typically scour
customer interactions, such as social media comments, reviews, or even brand name mentions,
to see what’s being said. Analysis of these interactions can help brands determine how well a
marketing campaign is doing or monitor trending customer issues before they decide how to
respond or enhance service for a better customer experience. Additional ways that NLP helps
with text analytics are keyword extraction and finding structure or patterns in unstructured text
data. (Tableau Software, 2020)
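A minimal sketch of lexicon-based sentiment scoring, the simplest form of the analysis described above. The word lists are toy assumptions of mine; commercial tools use much larger lexicons or trained classifiers, but the scoring idea is the same:

```python
# Toy sentiment lexicons (illustrative assumptions, not a real lexicon).
POSITIVE = {"love", "great", "helpful", "fast", "amazing"}
NEGATIVE = {"slow", "broken", "terrible", "worst", "refund"}

def sentiment(comment: str) -> str:
    """Score a comment by counting positive vs. negative lexicon hits."""
    words = set(comment.lower().replace(",", " ").replace(".", " ").split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

mentions = [
    "Love the new app, great update!",
    "Checkout is broken and support is slow.",
    "Delivery arrived on Tuesday.",
]
print([sentiment(m) for m in mentions])
```

Run over thousands of social media mentions, even a crude scorer like this lets a brand watch how sentiment trends during a campaign.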
Conclusion: In the task above I first introduced the research outcomes in a manner appropriate
for the intended audience. I then showed how the research coherently and logically communicates
its outcomes to the intended audience, demonstrating how those outcomes meet the set research
objectives.
LO4: Reflect on the application of research methodologies and concept.
Introduction: In the task below I reflect on the effectiveness of the research methods applied
to meet the objectives of the business research project, considering alternative research
methodologies and the lessons learnt in view of the outcomes.
Reflect on the effectiveness of research methods applied for meeting objectives of the business
research project.

The relationship between AI and natural language processing

While natural language processing isn’t a new science, the technology is rapidly advancing
thanks to an increased interest in human-to-machine communications, plus an availability of big
data, powerful computing and enhanced algorithms.

As a human, you may speak and write in English, Spanish or Chinese. But a computer’s native
language – known as machine code or machine language – is largely incomprehensible to most
people. At your device’s lowest levels, communication occurs not with words but through
millions of zeros and ones that produce logical actions.

Indeed, programmers used punch cards to communicate with the first computers 70 years ago.
This manual and arduous process was understood by a relatively small number of people. Now
you can say, “Alexa, I like this song,” and a device playing music in your home will lower the
volume and reply, “OK. Rating saved,” in a human-like voice. Then it adapts its algorithm to
play that song – and others like it – the next time you listen to that music station.

Let’s take a closer look at that interaction. Your device activated when it heard you speak,
understood the unspoken intent in the comment, executed an action and provided feedback in a
well-formed English sentence, all in the space of about five seconds. The complete interaction
was made possible by NLP, along with other AI elements such as machine learning and deep
learning. (Bell, 2020)
Effectiveness of research methods applied for meeting objectives of the business research
project: NLP algorithms teach computers to use language like people. If you were manually
searching for information from a set of documents, you’d skim for keywords too, just like search
engines. That’s why machine translation, the first form of NLP, was modeled on World War II
code breaking techniques. Developers hoped machine translation would translate Russian into
English. Results were horrible, but the coders kept at it and a new type of machine learning was
born. And because a company can’t grow internationally without translation, NLP was a
technology with a business case from the get-go. Today, natural language processing is as
integral to the workplace as communication itself. Following are four NLP business applications
emerging today, relevant to these objectives, that we won’t think twice about using tomorrow. If
your company is looking to make use of NLP, here are some places to start. (Bell, 2020)

1. Neural machine translation

Machine translation (MT) used to be laughable, but it’s pretty good now. Since natural language
processing software learns language in the way a person does, think of early MT as a toddler.
Over time, more words get added to an engine, and soon there’s a teenager who won’t shut up.
Machine translation quality is inherently dependent on the number of words you give it, which
takes time and originally made MT hard to scale. Fortunately, for businesses that don’t want to
wait for an engine to “grow up,” there’s neural machine translation. In 2016, Microsoft’s Bing
Translator became the first to launch the tech. Google Translate and Amazon Translate now offer
competing systems. Before neural, machine translation engines operated in only one direction —
say, Spanish into English. If you wanted to translate from English into Spanish, you had to start
over with a different data set. And if you wanted to add a third language, well, that was crazy.
But with neural machine translation, engineers can cross-apply data. This radically speeds up
development, taking a machine translation engine from zero to amazing in months vs. years. As a
result, businesses can safely use MT to translate low-impact content: product reviews, regulatory
docs that no one reads, email.

One warning, though: Free machine translation tools — neural or not — are a data security risk.
An alleged Translate.com leak resulted in employee passwords, contracts, and other personally
identifiable information (PII) surfacing in Google search results. Machine translation itself is
perfectly safe — if you use the professional kind custom-built by Asia Online, Systran, and
others. Just be careful what you enter in free tools online. (Bell, 2020)

2. Chatbots

If machine translation is one of the oldest natural language processing examples, chatbots are the
newest. Bots streamline functionality by integrating in programs like Slack, Skype, and
Microsoft Teams. When they first came on the scene, chatbots were consumer-facing. For
example, if you typed “pizza” into Facebook Messenger, a Domino’s bot would ask to take your
order. While touch points like these can help drive B2C sales, in a B2B world no one wants
purchasing reminders interrupting them in Slack. So over the past year, startups have applied the
tech to other areas: Most enterprise bots optimize HR. First, there’s Talla, a natural language
processing tool that answers common employee questions like “How much vacation do I have
left?” and “When does my insurance kick in?” Chatbot Polly polls employees on everything
from workplace satisfaction to what snacks they want in the breakroom. Then there’s Growbot, a
Slack and Teams bot that monitors chat to see how often employees compliment one another.
When words like “kudos,” “cheers,” and “props” are used, staff get rewarded. Cofounder and
CEO Jeremy Vandehey says this helps managers improve retention and morale. (Bell, 2020)
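Growbot's internals are not public, but the keyword-monitoring idea can be sketched in a few lines of Python. The praise words come from the paragraph above; everything else is my own illustrative assumption:

```python
from collections import Counter
import re

# Recognition words mentioned in the paragraph above; the rest of the
# logic is a hypothetical sketch, not Growbot's actual implementation.
PRAISE_WORDS = {"kudos", "cheers", "props"}

def count_praise(messages):
    """Tally how often each author gives praise in (author, text) messages."""
    tally = Counter()
    for author, text in messages:
        words = set(re.findall(r"[a-z']+", text.lower()))
        if PRAISE_WORDS & words:
            tally[author] += 1
    return tally

chat = [
    ("dana", "kudos to the release team!"),
    ("sam",  "cheers everyone, great sprint"),
    ("dana", "standup moved to 10am"),
]
print(count_praise(chat))
```

A real bot would sit behind the Slack or Teams API and stream messages in, but the recognition step reduces to this kind of keyword matching.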

3. Hiring tools

On the topic of HR, natural language processing software has long helped hiring managers sort
through resumes. Using the same techniques as Google search, automated candidate sourcing
tools scan applicant CVs to pinpoint people with the required background for a job. But — like
early machine translation — the sorting algorithms these platforms used made a lot of mistakes.
Say an applicant called herself a “business growth brainstormer” instead of an “outside sales
rep”: Her resume wouldn’t show in results and your company would overlook a creative, client-
driven candidate. Today’s systems move beyond exact keyword match. Scout, for example,
addresses the synonym issue by searching for HR’s originally provided keywords, then using
results to identify new words to look for. Extrapolating new terms (like “business growth”) keeps
qualified candidates from slipping between the cracks. And since women and minorities use
language differently, the process makes sure they don’t slip through either. Still, diverse
candidates can’t be considered if they don’t apply. To address that problem, there’s
Textio. Cofounder and CEO Kieran Snyder says the augmented writing tool uses semantic
categorization, a natural language processing technique, to help recruiters craft gender-neutral
job descriptions. Scoring posts on a scale of zero to 100 as you write, Textio provides
vocabulary, syntax, and formatting tips like “add more bullet points.” Implement these changes
and client case studies suggest you’ll see a radical improvement in applicant numbers: Snyder
says Johnson & Johnson experienced a 9 percent increase in female applicants, Avery Dennison
saw a 60 percent increase, and “Expedia found that jobs that were gender-neutral filled nearly
three weeks faster.” (Bell, 2020)

4. Conversational search

Like Talla, Second Mind wants to answer all your employees’ questions. But this tool isn’t a bot:
It’s a voice-activated platform that listens in on company meetings for trigger phrases like “what
are” and “I wonder.” When it hears them, Second Mind’s search function whirs into action,
seeking an answer for the rest of your sentence.

Say, for example, you’re in a board meeting and someone says, "What was the ROI on that last
year?" Silently, Second Mind would scan company financials — or whatever else they asked
about — then display results on a screen in the room. Founder Kul Singh says the average
employee spends 30 percent of the day searching for information, costing companies up to
$14,209 per person per year. By streamlining search in real-time conversation, Second Mind
promises to improve productivity. (Bell, 2020)

Advantages of Secondary Research


1. Secondary Research is a less expensive and less time-consuming process as data required is
easily available and doesn’t cost much.

2. Most information in secondary research is already available. There are many sources from
which relevant data can be collected and used, unlike primary research, where data needs to be
collected from scratch.
3. With the help of secondary research, organizations can form a hypothesis and evaluate the cost
of conducting primary research.

4. Secondary research is quicker to conduct because of the availability of data. Secondary
research can be completed within a few weeks depending on the objective of businesses or scale
of data needed.

For me, secondary research is the right choice because my country is an underdeveloped country
with no drone companies where I could conduct one-to-one interviews, case studies,
questionnaires, or surveys. (Bell, 2020)

Consider alternative research methodologies and lessons learnt in view of the outcomes.
Alternative research methodologies and lessons learnt in view of the outcomes:
Hyponymy shows how a specific instance is related to a general term (i.e. a cat is a mammal) and
meronymy denotes that one term is a part of another (i.e. a cat has a tail). Such relationships
must be understood to perform the task of textual entailment, recognizing when one sentence is
logically entailed in another. Now that we’re more enlightened about the myriad challenges of
language, let’s return to Liang’s four categories of alternative methodologies for meeting the
outcomes of natural language processing.

1. DISTRIBUTIONAL methodology:

Distributional methodology includes the large-scale statistical tactics of machine learning and
deep learning. These methods typically turn content into word vectors for mathematical analysis
and perform quite well at tasks such as part-of-speech tagging (is this a noun or a verb?),
dependency parsing (does this part of a sentence modify another part?), and semantic relatedness
(are these different words used in similar ways?). These NLP tasks don’t rely on understanding
the meaning of words, but rather on the relationship between words themselves. Such systems
are broad, flexible, and scalable. They can be applied widely to different types of text without the
need for hand-engineered features or expert-encoded domain knowledge. The downside is that
they lack true understanding of real-world semantics and pragmatics. Comparing words to other
words, or words to sentences, or sentences to sentences can all result in different outcomes.
Semantic similarity, for example, does not mean synonymy. A nearest neighbor calculation may
even deem antonyms as related.
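To make the antonym problem concrete, here is a toy Python sketch using hand-picked vectors. The numbers are invented purely for illustration:

```python
import math

# Toy 3-d vectors chosen by hand to illustrate the point: antonyms like
# "hot" and "cold" appear in near-identical contexts ("the coffee is
# ___"), so distributional training places them close together even
# though their meanings are opposite.
vectors = {
    "hot":   [0.9, 0.8, 0.1],
    "cold":  [0.8, 0.9, 0.1],
    "table": [0.1, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity, the standard relatedness measure for word vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine(vectors["hot"], vectors["cold"]))   # high: antonyms look "related"
print(cosine(vectors["hot"], vectors["table"]))  # low: unrelated word
```

A nearest-neighbour lookup over these vectors would rank "cold" as the closest word to "hot", which is exactly why semantic similarity must not be mistaken for synonymy.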

Advanced modern neural network models, such as the end-to-end attentional memory
networks pioneered by Facebook or the joint multi-task model invented by Salesforce, can handle
simple question and answering tasks, but are still in early pilot stages for consumer and
enterprise use cases. Thus far, Facebook has only publicly shown that a neural network trained
on an absurdly simplified version of The Lord of The Rings can figure out where the elusive One
Ring is located.

Although distributional methods achieve breadth, they cannot handle depth. Complex and
nuanced questions that rely on linguistic sophistication and contextual world knowledge have yet to
be answered satisfactorily. (Topbots.com, 2020)

2. FRAME-BASED methodology:

“A frame is a data-structure for representing a stereotyped situation,” explains Marvin Minsky in
his seminal 1974 paper called “A Framework For Representing Knowledge.” Think of frames as
a canonical representation for which specifics can be interchanged.
Liang provides the example of a commercial transaction as a frame. In such situations, you
typically have a seller, a buyer, goods being exchanged, and an exchange price.
 

Sentences that are syntactically different but semantically identical – such as “Cynthia sold Bob
the bike for $200” and “Bob bought the bike for $200 from Cynthia” – can be fit into the same
frame. Parsing then entails first identifying the frame being used, then populating the specific
frame parameters – i.e. Cynthia, $200. The obvious downside of frames is that they require
supervision. In some domains, an expert must create them, which limits the scope of frame-based
approaches. Frames are also necessarily incomplete. Sentences such as “Cynthia visited the bike
shop yesterday” and “Cynthia bought the cheapest bike” cannot be adequately analyzed with the
frame we defined above. (Topbots.com, 2020)
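The frame-filling idea can be sketched in Python with two regular-expression patterns standing in for a real parser. The patterns are my own illustrative assumptions:

```python
import re

# Two surface patterns that fill the same commercial-transaction frame;
# a production system would use a parser rather than regexes, but the
# frame-filling idea is the same.
PATTERNS = [
    re.compile(r"(?P<seller>\w+) sold (?P<buyer>\w+) the (?P<goods>\w+) for \$(?P<price>\d+)"),
    re.compile(r"(?P<buyer>\w+) bought the (?P<goods>\w+) for \$(?P<price>\d+) from (?P<seller>\w+)"),
]

def parse_transaction(sentence):
    """Return the filled frame slots, or None if the sentence doesn't fit."""
    for pattern in PATTERNS:
        match = pattern.search(sentence)
        if match:
            return match.groupdict()
    return None

print(parse_transaction("Cynthia sold Bob the bike for $200"))
print(parse_transaction("Bob bought the bike for $200 from Cynthia"))
print(parse_transaction("Cynthia visited the bike shop yesterday"))  # None
```

Both syntactic variants yield identical slot fillings, while the third sentence returns None, showing the incompleteness problem: sentences outside the frame simply cannot be analyzed.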

3. MODEL-THEORETICAL methodology:

Model theory refers to the idea that sentences refer to the world, as in the case with grounded
language (i.e. the block is blue). In compositionality, meanings of the parts of a sentence can be
combined to deduce the whole meaning. Liang compares this approach to turning language into
computer programs. To determine the answer to the query “what is the largest city in Europe by
population”, you first have to identify the concepts of “city” and “Europe” and funnel down your
search space to cities contained in Europe. Then you would need to sort the population numbers
for each city you’ve shortlisted so far and return the maximum of this value.

To execute the sentence “Remind me to buy milk after my last meeting on Monday” requires
similar composition breakdown and recombination.


Models vary from needing heavy-handed supervision by experts to light supervision from
average humans on Mechanical Turk. The advantages of model-based methods include full-
world representation, rich semantics, and end-to-end processing, which enable such approaches
to answer difficult and nuanced search queries. The major con is that the applications are heavily
limited in scope due to the need for hand-engineered features. Applications of model-theoretic
approaches to NLU generally start from the easiest, most contained use cases and advance from
there. (Topbots.com, 2020)
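The compositional execution Liang describes can be sketched over a toy knowledge base: filter first, then take the maximum, exactly mirroring how the query decomposes into steps. The population figures below are illustrative only:

```python
# Toy knowledge base; figures are illustrative, not authoritative.
cities = [
    {"name": "Istanbul", "continent": "Europe", "population": 15_500_000},
    {"name": "Moscow",   "continent": "Europe", "population": 12_600_000},
    {"name": "Tokyo",    "continent": "Asia",   "population": 13_900_000},
]

def largest_city(continent: str) -> str:
    """Execute 'largest city in <continent> by population' compositionally:
    first narrow the search space to the continent, then take the max."""
    candidates = [c for c in cities if c["continent"] == continent]
    return max(candidates, key=lambda c: c["population"])["name"]

print(largest_city("Europe"))
```

The point of the model-theoretic view is that the sentence *is* this little program: understanding the query means being able to construct and run it.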

4. INTERACTIVE LEARNING

Paul Grice, a British philosopher of language, described language as a cooperative game between
speaker and listener. Liang is inclined to agree. He believes that a viable approach to tackling
both breadth and depth in language learning is to employ dynamic, interactive environments
where humans teach computers gradually. In such approaches, the pragmatic needs of language
inform the development. To test this theory, Liang developed SHRDLURN as a modern-day
version of Winograd’s SHRDLU. In this interactive language game, a human must instruct a
computer to move blocks from a starting orientation to an end orientation. The challenge is that
the computer starts with no concept of language. Step by step, the human says a sentence and
then visually indicates to the computer what the result of the execution should look like.
(Topbots.com, 2020)

If a human plays well, he or she adopts consistent language that enables the computer to rapidly
build a model of the game environment and map words to colors or positions. The surprising
result is that any language will do, even individually invented shorthand notation, as long as you
are consistent.
 

The worst players who take the longest to train the computer often employ inconsistent
terminology or illogical steps.
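The word-to-meaning learning in such a game can be sketched as simple co-occurrence counting. This is my own toy approximation, not Liang's actual algorithm:

```python
from collections import Counter, defaultdict

class BlockLearner:
    """Starts with no concept of language; associates a player's words
    with the colors involved in each demonstrated move, in the spirit
    of a SHRDLURN-style game."""

    def __init__(self):
        self.evidence = defaultdict(Counter)

    def observe(self, utterance: str, color_used: str) -> None:
        for word in utterance.lower().split():
            self.evidence[word][color_used] += 1

    def guess(self, word: str) -> str:
        """Return the color most often seen alongside this word."""
        return self.evidence[word.lower()].most_common(1)[0][0]

learner = BlockLearner()
learner.observe("add red", "red")
learner.observe("add red block", "red")
learner.observe("add blue", "blue")
print(learner.guess("red"), learner.guess("blue"))
```

A consistent player's vocabulary converges quickly under this kind of counting, while inconsistent terminology keeps the evidence table ambiguous, which matches the observation about the worst players above.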

Conclusion: In the task above I first reflected on the effectiveness of the research methods
applied to meet the objectives of the business research project, considering alternative research
methodologies and the lessons learnt in view of the outcomes. I then provided evidence and
further documentation of critical reflection and insight, resulting in recommended actions for
improvements and future research considerations.
References:
1. Us.sagepub.com. (2019). [online] Available at:
https://us.sagepub.com/sites/default/files/upm-
binaries/55588_Chapter_1_Sample_Creswell_Research_Design_4e.pdf [Accessed 27
Dec. 2019].
2. Mayk, M. (2019). Let’s Talk Price: How Much Does Research Cost? - Market
Connections. [online] Market Connections. Available at:
https://www.marketconnectionsinc.com/lets-talk-price-how-much-does-research-cost/
[Accessed 27 Dec. 2019].
3. Courses.lumenlearning.com. (2020). Forms of Primary Research | English 112:
Exposition and Persuasion. [online] Available at:
https://courses.lumenlearning.com/ivytech-engl112/chapter/forms-of-primary-research/
[Accessed 8 Jan. 2020].
4. Anon, (2020). [online] Available at: https://www.nhcc.edu/student-
resources/library/doinglibraryresearch/basic-steps-in-the-research-process [Accessed 8
Jan. 2020].
5. Smartsheet. (2020). Ultimate Guide to Project Cost Estimating | Smartsheet. [online]
Available at: https://www.smartsheet.com/ultimate-guide-project-cost-estimating
[Accessed 8 Jan. 2020].
6. Aclweb.org. (2020). [online] Available at: https://www.aclweb.org/anthology/W17-
1613.pdf [Accessed 8 Jan. 2020].
7. QuestionPro. (2020). Data analysis in research: Why data, types of data, data analysis in
qualitative and quantitative research | QuestionPro. [online] Available at:
https://www.questionpro.com/blog/data-analysis-in-research/ [Accessed 8 Jan. 2020].
8. Tableau Software. (2020). 8 common examples of natural language processing and their
impact on communication. [online] Available at:
https://www.tableau.com/learn/articles/natural-language-processing-examples [Accessed
8 Jan. 2020].
9. Bell, T. (2020). 4 business applications for natural language processing. [online] CIO.
Available at: https://www.cio.com/article/3241814/4-business-applications-for-natural-
language-processing.html [Accessed 8 Jan. 2020].
10. Topbots.com. (2020). 4 Approaches To Natural Language Processing & Understanding |
TOPBOTS. [online] Available at: https://www.topbots.com/4-different-approaches-
natural-language-processing-understanding/ [Accessed 8 Jan. 2020].
