Online reviews and ratings
The impact on individual decision-making and product demand
An experiment conducted to test the proposed hypotheses is discussed below. This methodology chapter discusses the design, sample and procedure used when conducting this research. In this study the researcher focused particularly on the Millennial generation and used a novel research method based on website analytics. In order to understand the sample and the technicalities associated with this research, the relevant terms are discussed below. Lastly, the chapter covers the ethical considerations that were taken into account during this study.
4.1. Research Design
This research adopts a mixed-methods approach. The qualitative component involves secondary data examined in the literature review; the quantitative component investigates primary data collected through an experiment and a survey. The researcher used a deductive research design, drawing hypotheses from existing theory. It is also a confirmatory study that aims to corroborate the findings of a number of past researchers.
Themes were built using a top-down approach: the researcher began with a theory about the topic of interest and narrowed it down into more specific hypotheses that could be tested. This study used an experimental method designed to examine how consumers process online reviews and what impact this has on sales. Subjects took part in an A/B test in which one version of the website displayed eWOM and the other did not. Additionally, participants were asked to evaluate the importance of eWOM when making a purchase decision. This was asked through an online survey that also collected confirmatory questions and demographic information.
4.1.1. Online consumer reviews and ratings
Google Analytics technology was used to determine the impact of online reviews. Subjects were asked to browse the eCommerce website as usual while a Google Analytics script collected the average time spent on the page, the number of decisions participants made when choosing an event, and whether they decided to buy a ticket.
Before the main experiment, the researcher conducted a pre-test to determine the quality of the online reviews. On the basis of real online reviews and ratings from the eCommerce website YPlan (yplanapp.com), the researcher included eWOM on the B experimental website. Each eWOM item included a rating, the number of people attending an event and the number of reviews submitted (Appendix 2). The amount of information in each eWOM item was limited in this way to reduce the effort required for information processing (Petty et al., 1983).
One experimental website (A test) did not include customer reviews and ratings (the control condition) and the other (B test) did (the treatment condition). Before the main experiment, the researcher conducted a trial experiment with a Film category and observed whether consumer behaviour changed when participants were exposed to the two different website versions. The trial experiment included 53 participants. Based on the results, the website design and research procedure were revised. To check the quality of the manipulation, the researcher asked participants to confirm their selection in the online survey. Appendix 2 shows an example of the revised website design.
4.1.2. The experimental product and eCommerce website
According to Mangold and Smith (2011: p. 146), a company website is one of the most frequently used online venues for voicing opinions. Thus, an eCommerce website prototype was created for this study (6u2tbt.axshare.com) using the prototyping software Axure RP Pro 7.0. The website's site map is shown in Figure 1.
The website prototype was based on YPlan's website (a ticketing platform where customers purchase event tickets online), with a layout and functions similar to other international ticketing websites such as TicketMaster.com, where online users contribute large numbers of reviews evaluating a wide range of products. On YPlanApp.com, online reviews mainly concern Film, Performance and Nightlife events. Each event has a unique page where the number of reviews and the rating, as well as the number of people attending the event, are listed alongside the event's basic information. YPlan offers numeric information showing the number of reviews for each event and uses star ratings and attendee numbers to display the different contributions. YPlanApp.com therefore appeared to be a suitable research site because it enables subjects to navigate through the website and observe online review heuristics (i.e. perceived quantity of arguments and average rating).
A suitable product for testing the hypotheses was chosen based on three criteria. First, the researcher aimed to choose a product that appealed to young audiences, such as nightlife events, and that was easy to understand and affordable for students and young professionals. Second, the researcher reduced the risk of product familiarity by exposing participants to products they had no previous knowledge of. Third, the product descriptions included appealing images and easy-to-read content that encouraged engagement.
Three events were chosen for this study, namely ‘Friday Night Speed Dating in the City’ (£21), ‘Playboy Club Presents Valentine's 2016’ (£50) and ‘Vault Festival 2016 Lates: B Movie Ball’ (£10) (Figure 2). For event-tracking purposes when designing the website pages, these were coded as ‘female’, ‘male’ and ‘group’ events respectively.
Furthermore, the researcher implemented an A/B test experimental design. A/B testing refers to testing two different versions of a page or of a specific element, such as the presence of online reviews (Chaffey and Ellis-Chadwick, 2012). Visitors are randomly split between the two pages, and changes in visitor behaviour can then be compared using different metrics such as click-through rate, average time spent on the pages, or macro-conversion rates such as conversion to sale (Chaffey and Ellis-Chadwick, 2012). The aim is to increase page effectiveness, including conversion rates and revenue per visit (Chaffey and Ellis-Chadwick, 2012).
When conducting A/B testing it is important to identify a realistic control page to compare against; the new alternative is then compared to this baseline (Chaffey and Ellis-Chadwick, 2012). For example, the control page may be a landing page without online reviews and ratings, while the alternative version is designed to include those additional attributes.
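The random split between the two versions can be sketched as follows. This is a minimal illustration only, not the software used in this study; the function name, seed and participant identifiers are hypothetical.

```python
import random

def assign_variant(participant_id: str, seed: int = 42) -> str:
    """Deterministically assign a participant to variant 'A' (control,
    no reviews) or 'B' (treatment, with eWOM)."""
    rng = random.Random(f"{seed}:{participant_id}")
    return "A" if rng.random() < 0.5 else "B"

# Each participant then receives the landing-page link for their variant.
links = {
    "A": "http://6u2tbt.axshare.com/a_start.html",
    "B": "http://6u2tbt.axshare.com/b_start.html",
}
variant = assign_variant("participant-001")
print(variant, links[variant])
```

Seeding the generator with the participant identifier makes the assignment reproducible: the same participant always lands in the same condition, which matters when links are distributed over several days.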
In the A (test) experimental website the events did not have online reviews and ratings on the description page (Figure 3). In contrast, B (test) experimental website had three types of eWOM: number of people attending an event, number of reviews submitted and ‘star’ rating (Figure 4).
In the B (test) site, eWOM values were assigned randomly. The highest perceived value was given to ‘Vault Festival 2016 Lates: B Movie Ball’, followed by ‘Friday Night Speed Dating in the City’ and ‘Playboy Club Presents Valentine's 2016’. The assigned eWOM is summarised in Table 1.
The researcher replicated the product information page of the real eCommerce website YPlan (an event ticketing platform). The page included an image and a brief explanation of the event's purpose. The page was modified by adding some functions and eliminating particular attributes on the basis of the real information. Online customer reviews and ratings were located to the right of the product description, under the purchase buttons, as shown in Figure 3.
4.1.3. Tracking and data collection
Web analytics involves collecting, measuring, monitoring, analysing and reporting web usage data to understand the consumer experience online (Hasan et al., 2009: p. 698). It provides simple statistics such as the number of visitors, the average number of page views per visitor and the average duration on the page (Plaza, 2011: p. 477). The technology can help to optimise websites in order to accomplish business goals, as well as improve customer satisfaction and increase loyalty (Hasan et al., 2009: p. 698).
In order to use the Google Analytics software, it is necessary to install the required script on the company's website (Hasan et al., 2009: p. 698). The tracking application records traffic by inserting a small piece of HTML code into every page of the website (Plaza, 2011: p. 477). The data is collected when pages load in the browser (Hasan et al., 2009: p. 698). Google Analytics has become an important research method for a number of reasons. First, it provides time-series data (Plaza, 2011: p. 477). Furthermore, Google Analytics is a free, user-friendly application with a guarantee of high-quality technology from Google (Plaza, 2011: p. 477).
Analytics tells the web owner how visitors interact with the page and provides insights into how to improve the site content and design to be more attractive to the user (Plaza, 2011: p. 477). For instance, Google Analytics reveals that returning visitors spend more time on the website and that a lower bounce rate results in greater duration (Plaza, 2011: p. 481).
For the purpose of tracking participants' activity on the prototype website, a script with a small piece of HTML code was inserted into both (A and B) versions of the experimental website. The Google Analytics traffic overview (Figure 5) shows that web links were sent to 97 users in total between 2 February 2016 and 28 February 2016. Of those visitors, 62 completed the survey and were thus counted as eligible respondents.
The number of false clicks, or bounce entries, is the number of times visitors immediately exit the website from the entrance page. In order to avoid confusion and misinterpretation of the data, the researcher used UTM tracking links. An individual link was customised for every respondent so that data would not be lost in the process. This was done by setting the ‘Campaign’ parameter to the person's name. For example:
http://6u2tbt.axshare.com/b_start.html?utm_source=Facebook&utm_medium=Test&utm_term=B&utm_campaign=Agniete
The Google URL builder helps to add parameters to URLs in custom web-based campaigns (Google Support, 2016). When a user clicks one of the links, the unique parameters are sent to the Google Analytics account so it can identify the URLs that are most effective in attracting users to the website (Google Support, 2016).
There are a number of parameters used to build a unique tracking link. Campaign Source identifies the source of the traffic (e.g. Facebook). Campaign Medium identifies the medium (e.g. Test). Campaign Term (keyword) may be used in A/B testing to differentiate versions (e.g. A or B). Campaign Name identifies a specific campaign for keyword analysis; in this study it was set to the person's name (e.g. Agniete) (Google Support, 2016). Figure 6 shows an example of a UTM link used for this study.
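Per-participant links of this form can also be generated programmatically. The sketch below is illustrative only (the study's links may well have been built by hand with Google's URL builder); it uses Python's standard library and the same four UTM parameters described above.

```python
from urllib.parse import urlencode

def build_utm_link(base_url: str, source: str, medium: str,
                   term: str, campaign: str) -> str:
    """Append UTM tracking parameters to a base URL."""
    params = urlencode({
        "utm_source": source,      # traffic source, e.g. Facebook
        "utm_medium": medium,      # medium, e.g. Test
        "utm_term": term,          # A/B variant identifier
        "utm_campaign": campaign,  # per-respondent identifier
    })
    return f"{base_url}?{params}"

link = build_utm_link("http://6u2tbt.axshare.com/b_start.html",
                      "Facebook", "Test", "B", "Agniete")
print(link)
# → http://6u2tbt.axshare.com/b_start.html?utm_source=Facebook&utm_medium=Test&utm_term=B&utm_campaign=Agniete
```

Generating links this way avoids the transcription errors that are easy to make when dozens of near-identical URLs are assembled manually.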
Data was collected using Google Analytics segmentation (Figure 7). The researcher filtered by parameter to collect each individual respondent's data, such as time spent on the page and the number of decisions made before purchase / no purchase (with eWOM and without eWOM). This information is reached by following these steps in Google Analytics: Behaviour > Site Content > All Pages. An example of customised data collection is summarised in Figure 8.
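Once exported, such rows can be aggregated per respondent using the campaign parameter. The following sketch uses hypothetical field names and invented values, not Google Analytics' actual export schema, purely to illustrate the per-respondent grouping step.

```python
from collections import defaultdict

# Hypothetical exported rows: (utm_campaign, page, seconds on page).
rows = [
    ("Agniete", "/b_start.html", 34),
    ("Agniete", "/b_event_group.html", 70),
    ("Tomas", "/a_start.html", 25),
]

# Aggregate page count and total time per respondent.
per_respondent = defaultdict(lambda: {"pages": 0, "seconds": 0})
for campaign, page, seconds in rows:
    per_respondent[campaign]["pages"] += 1
    per_respondent[campaign]["seconds"] += seconds

print(dict(per_respondent))
```

The number of pages visited stands in for the "number of decisions" metric, and total seconds for time on site, mirroring the two measures the study collected per participant.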
Individual data was collected from Google Analytics and transferred to an MS Excel file. It was then examined using the statistical software GraphPad, and a t-test was applied to analyse the significance of the results.
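The comparison GraphPad performs can be sketched as a two-sample t-test. The version below is Welch's unequal-variance form, implemented with Python's standard library; it is an illustration of the statistic, not the software used in the study, and the seconds-on-page values are invented.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic and degrees of freedom, e.g. for
    comparing time on page between the A (control) and B (eWOM) groups."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    se2 = va / na + vb / nb                      # squared standard error
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    # Welch–Satterthwaite approximation for the degrees of freedom.
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical seconds-on-page for the control (A) and eWOM (B) groups.
a = [62, 75, 58, 90, 71]
b = [104, 96, 120, 88, 110]
t, df = welch_t(a, b)
print(round(t, 2), round(df, 1))  # → -4.12 8.0
```

A negative t here simply reflects that the (invented) B group spent longer on the page; significance would then be read from the t distribution with the computed degrees of freedom.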
To capture demographic information and better understand the impact of online reviews and ratings, the researcher also created an online survey. After visiting the prototype website, participants were immediately asked to answer a short survey, built from short questions and implemented online in Google Forms.
The survey comprised individual demographics, an online reviews and ratings importance scale, and confirmatory questions such as which type of event participants chose and whether they decided to buy the ticket.
The online survey administered after the digital experiment assessed individuals' personal involvement when using online reviews and ratings. Zaichkowsky's (1994) revised personal involvement scale includes a cognitive dimension measuring, in this case, the importance of eWOM when making purchase decisions. The revised scale reduced the original from 20 to 5 items (Zaichkowsky, 1994). Participants were asked to respond on a scale from 0 to 5, where 0 equals ‘not at all important’ and 5 equals ‘very important’. Responses were then transferred to MS Excel and analysed with GraphPad statistical software.
4.2. Sample
Being one of the most digitally literate populations, Millennials are among the most attractive targets for consumer-orientated firms (Mangold & Smith, 2012: p. 142). Born between 1981 and 1997, they are the biggest generation group since the baby boomers (Mangold & Smith, 2012: p. 142).
There are several reasons why Millennials are an important market to target. First, they are familiar with, and shape, digital technologies. Having grown up making purchases online, savvy Millennials became a vital component in the evolution of social media (Mangold & Smith, 2012: p. 141). Using digital media on a daily basis, including computers and mobile devices, they became a driving force of online communications (Mangold & Smith, 2012: p. 141). Second, this generation is especially important because its members are highly motivated to seek and share information provided by their peers or other customers (Mangold & Smith, 2012: p. 142). This means they shape each other's behaviour.
New venues of communication give Millennials control over information (Mangold & Smith, 2012: p. 143). Consumers now have the ability to hold companies to their standards by purchasing from those who share similar values (Mangold & Smith, 2012: p. 143). Participants can now have a voice in the design of products and the shopping experience; marketers should therefore aim to gain insights into the needs and preferences of their consumers (Mangold & Smith, 2012: p. 144).
The sample consisted of 62 people: undergraduate students and working professionals. Ages ranged between 18 and 30, with an average age of 22, a median age of 21 and a variance of 6.48. Students and working professionals were targeted due to their likelihood of being familiar with digital content and able to use and participate in the online environment. The sample was asked to enter an eCommerce website and make a purchase decision on a selected event.
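Descriptive statistics of this kind can be computed directly with Python's standard library. The age values below are invented for illustration; the study's raw ages are not reproduced here.

```python
from statistics import mean, median, variance

def describe(ages):
    """Summarise a sample of ages: mean, median and sample variance,
    as reported for the study's respondents."""
    return mean(ages), median(ages), variance(ages)

# Hypothetical ages for a small group of respondents (illustrative only).
ages = [18, 20, 21, 21, 22, 23, 24, 26]
m, med, var = describe(ages)
print(m, med, var)
```

Note that `statistics.variance` computes the sample variance (dividing by n − 1), which is the appropriate estimator when, as here, the respondents are a sample rather than a full population.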
Participants were invited to respond individually using Facebook's instant messaging tool. Individuals responded voluntarily via the Internet and were sent an information sheet at the beginning of the study (Appendix 1). Individuals were able to participate from any location of their convenience due to the technology chosen for this research project. The study took place between 2 February 2016 and 28 February 2016.
The sample was not observed while participating in the experiment, which ensured respondents did not feel authority pressure when taking part in the study. Participants were also granted full confidentiality. The experiment tracked via Google Analytics technology took 1 minute and 44 seconds on average. During data analysis, 35 entries had to be excluded because those respondents did not complete the survey (at the end of the experiment) and their participation was therefore counted as unsuccessful. This gives a final sample size of 62 participants.
4.3. Research Procedures
Participants were invited to take part in the study through the largest social network site, Facebook, receiving an individual message from the researcher. In order to stabilise the website setting, participants were asked to complete the study on a computer rather than a mobile screen. Any person interested in participating responded and clicked on the personalised tracking link. These persons were sent to a landing page with an information sheet to read through and, if they agreed, an informed consent form (Appendix 1). Entrance to the further website pages was granted after consent to participate. Participants could also choose a time and location convenient to them, because the study took place on the Internet.
Once respondents signed the consent form (Appendix 1), they were asked to browse the website as usual. Participants were only made aware that this was a study about online content, not an experiment specifically analysing consumer behaviour in relation to online reviews and ratings.
Participants were given a brief outline of the situation: that they were planning a night out in February and looking to book a certain event. In doing background research, they were exposed to the experimental website prototypes. They were told to browse as usual, navigate to different pages and click on the purchase / non-purchase buttons. After they clicked on their purchase decision, they were sent to the online survey, which captured demographic information, the involvement scale and confirmatory questions.
As participants were left alone and allowed to complete the experiment and survey at their convenience, the researcher did not influence or nudge them into making certain decisions, and outside influences were minimised. This allowed a more natural research environment in which participants could evaluate the eCommerce website as they would at home.
The data was recorded and saved in the Google Analytics and Google Forms accounts. The collected information was saved under participant names, then downloaded to MS Excel and analysed using GraphPad statistical software. Once both the experiment and survey were completed and saved, the participant was thanked for their time.
The process was the same for all participants: the same order of events and the same instructions. Additionally, all respondents were given the same content to observe and respond to. The only deception was that participants were not made aware that the study focused specifically on online reviews and ratings (as half of the participants did not see them); they were instead told the study concerned online content generally. Specific software was required to ensure data was collected accurately. Asking participants to complete the experiment and survey on a computer screen ensured a controlled setting for all respondents.
4.4. Ethical Considerations
Experiment and survey respondents received an information sheet and a consent form (Appendix 1) to confirm their voluntary participation in the study.
Ethical considerations were minor but nevertheless important. The only potential risk to participants was the sharing of their identity alongside the recorded site data (average time spent on the page and number of decisions made before purchase). To eliminate this risk, participants were granted full confidentiality before entering the prototype website.
The online reviews and eCommerce website were adapted and modified from the real YPlan website by the researcher. The content was written based on the real eCommerce website to reflect how real event descriptions are written, but was fictitious and recreated to control for any potentially harmful or upsetting content (Lockie, 2015). Furthermore, participants were aware that they were allowed to stop the experiment at any time with no consequences to themselves. If this happened, their data was erased (Lockie, 2015).