In this course, students have learned how to write a proper literature review. This is a necessary skill for anyone who wants to pursue an academic career after graduation, but it can be just as useful during undergraduate studies. In just seven weeks, all the steps of the writing process were covered, from formulating a good main question and finding literature to the actual writing, rewriting, and final editing. Subjects such as how to deal with literature, how to write in an academic style, how to build a comprehensive argument, and how to use other students’ comments to improve your own work were discussed.
The index of refraction is an important property of transparent materials, and it has always been measured to be positive for every material found in nature. However, a negative refractive index can be achieved with metamaterials: artificially structured materials consisting of so-called unit cells. In this review article, the applications of negative-refractive-index metamaterials reported in different research papers are discussed and compared. Furthermore, the theory behind negative-refractive-index metamaterials is also discussed.
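As a brief illustration of the theory involved (not drawn from the reviewed papers themselves): refraction at an interface is governed by Snell’s law, and a negative index flips the sign of the refraction angle, so the refracted ray bends to the same side of the surface normal as the incident ray.

```latex
% Snell's law at an interface between media 1 and 2:
n_1 \sin\theta_1 = n_2 \sin\theta_2
% For a passive medium, n = \pm\sqrt{\varepsilon_r \mu_r};
% the negative root applies when \varepsilon_r < 0 and \mu_r < 0
% hold simultaneously. Then n_2 < 0 forces \theta_2 < 0, i.e. the
% refracted ray lies on the same side of the normal as the
% incident ray.
```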
From the comparison of the research papers, it was concluded that the metamaterial fabricated by Suzuki et al. (Opt. Express 26, 8314, 2018) can be used in monochromatic optical systems in the low-terahertz frequency range, the metamaterial fabricated by Islam et al. (Mater. Technol. 50, 873, 2016) in radio-communication applications in the gigahertz frequency range, and the simulated metamaterial from Wi et al. (Opt. Commun. 412, 85, 2018) in almost all applications in the low-terahertz range, assuming that the fabricated metamaterial will match the simulated one.
In quantum computing, communication between two systems using a high-fidelity entangled pair of qubits is crucial. To improve the fidelity of an entangled qubit pair, a technique called distillation is used. There are many types of distillation protocols. In 1995, Bennett et al. published a paper in which a protocol called BBPSSW was presented. This protocol was improved upon in a paper published in 1996 by Deutsch et al., in which a new protocol called DEJMPS was presented. Finally, in 2008, a paper published by Campbell and Benjamin presented yet another way of distilling entangled qubit pairs, one designed for conditions of photon loss. In order to compare the protocols with one another, an explanation of each protocol will first be given.
Then a comparison will be made between the three protocols on four different points: input flexibility, probability of successful distillation, fidelity improvement per iteration, and efficiency. After this comparison, the conclusion can be drawn that DEJMPS is the best protocol overall, as it ranks highest across the four points stated above.
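To make the “fidelity improvement per iteration” and “probability of successful distillation” criteria concrete, one round of the BBPSSW recurrence can be sketched as follows. This is a minimal illustration assuming Werner-state inputs, using the standard recurrence relation from Bennett et al.; the function name is ours, not from the reviewed papers.

```python
def bbpssw_step(f):
    """One BBPSSW purification round on two Werner pairs of fidelity f.

    Returns (new_fidelity, success_probability) using the standard
    recurrence for Werner-state inputs. For f > 1/2, the surviving
    pair's fidelity increases at the cost of consuming one pair and
    succeeding only with probability p_success.
    """
    p_success = f**2 + 2 * f * (1 - f) / 3 + 5 * ((1 - f) / 3) ** 2
    f_new = (f**2 + ((1 - f) / 3) ** 2) / p_success
    return f_new, p_success

# Example: starting from fidelity 0.7, three successful rounds
f = 0.7
for _ in range(3):
    f, p = bbpssw_step(f)
```

Iterating the recurrence shows the trade-off the review evaluates: each round raises the fidelity but consumes pairs and only succeeds probabilistically, which is why efficiency is scored as a separate criterion.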
Previous studies have shown that the perception of a recommender system can have varied effects on the measurable user experience. Research has shown that the transparency of a system can improve a person’s trust in the system and its expected usefulness, and can potentially affect the system’s usability. However, transparency has also been reported to have negative effects on users’ perception of the system.
This literature review evaluates how transparency in recommender systems influences user experience (UX) in terms of usability, trust, and user expectations. The findings indicate a causal relationship between the transparency of a recommender system and its usability and user trust. User expectation, however, appears to be a by-product of certain methods that can be utilized to achieve transparency, in particular social influence.
Interventionist approaches to causation have enjoyed popularity within the scientific community, being regarded as the standard of causal inference. This methodological success has not translated into consensus on the metaphysical question of causation, namely: what is causality? Both manipulationists and non-manipulationists have raised criticisms of the interventionist view, arguing that it is unacceptably anthropocentric, circular, or methodologically fallible, or that another theory, such as the Agency Theory or Causal Pluralism, is more suitable to answer the metaphysical question.
In this literature review, I aim to survey the main points raised against the interventionist theory of causation as formulated by Woodward (2003), with a focus on the methodological and metaphysical criticisms.
The field of search algorithms has changed drastically in the past few decades. Modern search algorithms are solving completely different problems than just twenty years ago. This paper analyzes the factors that caused this shift and determines which innovations could affect the field in the future. To achieve this, some of the best-known search algorithms and techniques are compared and analyzed. The innovations in other industries have had a large impact on search algorithms by introducing new problems and expanding the scope of the field.
Most notably, modern search engines have brought attention to the problem of optimally sorting search results and have inspired the adoption of innovations from other fields such as AI and big data. In addition, the future of search algorithms looks encouraging, as artificial intelligence and quantum computing promise to improve the field even further. We can therefore conclude that the evolution of search algorithms was driven not by the exhaustion of classical methods but by advancements in various other industries.
With the emergence of the Internet of Things (IoT), distributed systems are becoming increasingly mainstream and relevant, and so are novel programming languages. Traditionally, distributed systems have been built from the ground up, tailored to their intended application. However, anything distributed is complex by nature, and it is exactly this complexity that some novel programming languages aim to simplify. By means of a literature review comparing the languages EdgeC, Distributed Oz, and OpenABL, insight is gathered into whether novel programming languages intended for distributed applications are actually meaningful in relation to IoT-scale distributed systems, answering the main question: “reliable execution of programs for distributed systems: are programming languages tailored for distributed applications beneficial in the development of such programs?”
Distributed Oz in particular, but also EdgeC, stands out in its capabilities. OpenABL, on the other hand, appears less flexible due to its niche agent-based model. Overall, however, these new programming languages appear promising.