
<-BACK to REM4-18 MAIN


Approved Topics With Tentative Titles






Comparison Of Different Methods For Counting Fish

Unnar Freyr Erlendsson

When fish are sold from onshore fish farms to offshore fish farms, they need to be counted: the seller wants to get maximum value for the fish, and the buyer does not want to pay for more than they receive. There are other variables to consider as well, such as how well each method treats the fish and how quickly the fish can be pumped from the fish farm onto the buyer's ship. Some methods already exist, such as de-water counters, where the fish must be de-watered before being sent through the counter; this can be slow, and de-watering may be harmful to the fish. We look at other methods, such as full-water counters, where the fish are sent through a counter without being de-watered first, and RFID tagging, where one only needs to count the RFID tags that pass through instead of relying on image recognition and machine-learning algorithms. The methods were compared on counting accuracy, harm to the fish, and the speed of the transaction between buyer and seller. Using these criteria, we found that RFID tagging is far superior to the other methods, as it achieves near-100% accuracy while not being overly invasive for the fish and having the fastest transaction time.





Comparison of Solving Coding Problems at Night Versus During the Day

Theódór Ágúst Magnússon

Programmers have been known to finish projects overnight, fueled by coffee and energy drinks, to meet a certain deadline. I ask whether working overnight leads to better, worse, or the same results as working during the day. Knowing this could greatly benefit software companies by giving them a broader set of facts with which to judge the feasibility of pursuing a certain deadline where the dev team would have to program overnight.

We would perform experiments using online coding-challenge software, which measures the quality of a solution relative to its creation time.

Hypothesis: When a programmer solves problems during night time, the solution is quicker to build but is more prone to testing failures.





Predicting Cryptocurrency Exchange Rates via Machine Learning Methods

Brynjar Sigurðsson

In the paper Automated Bitcoin Trading via Machine Learning Algorithms, the authors model the price-prediction problem as “a binomial classification task, experimenting with a custom algorithm that leverages both random forests and generalized linear models”.

In prior work it is mentioned that “Bayesian regression to Bitcoin price prediction” has been used. However, in the paper Financial time series forecasting with machine learning techniques: A survey, it is stated that “Artificial Neural Networks (ANNs) are identified to be the dominant machine learning technique in this area.”

I want to try to achieve a better prediction rate with a deep neural network.

Tentative hypothesis: Out of the three ML methods (ANN, SVM and RF), RF achieves the best precision and SVM the worst when used to predict future cryptocurrency exchange rates.
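
A minimal sketch of how such a comparison could be run, assuming daily closing prices in a CSV file and framing the task as next-day direction classification (the file name, feature construction and train/test split are illustrative assumptions, not part of the proposal):

<code python>
# Sketch: compare RF, SVM and a small ANN (MLP) on next-day price direction.
# Assumes a CSV with a 'close' column of daily closing prices; all names are
# illustrative placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import precision_score

prices = pd.read_csv("exchange_rates.csv")["close"]
returns = prices.pct_change().dropna()

# Features: the previous 5 daily returns; label: 1 if the next return is positive.
X = pd.concat({f"lag_{i}": returns.shift(i) for i in range(1, 6)}, axis=1).dropna()
y = (returns.loc[X.index].shift(-1) > 0).astype(int)
X, y = X.iloc[:-1], y.iloc[:-1]          # drop the last row (no next-day label)

# Chronological split so the models never see the future during training.
split = int(len(X) * 0.8)
X_train, X_test = X.iloc[:split], X.iloc[split:]
y_train, y_test = y.iloc[:split], y.iloc[split:]

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf"),
    "ANN": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, "precision:", precision_score(y_test, model.predict(X_test)))
</code>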





The effects of psychedelic substances on the five major dimensions of personality and their implications on general outlook, values and beliefs

Hákon Gunnarsson

Past work done with psychedelic substances in controlled environments has demonstrated a significant effect on the personality trait of openness. Less work has been done to ferret out their effects on the other four dominant traits of personality. The present study hopes to account for these effects, among other things.

Motivation: Psychedelics have been found to influence the results of psychometric tests in a statistically significant manner. The question remains whether or not these changes are superficial or if they are integrated fully enough to be expressed in a change in outlook, actions or other measurable qualities. If they are more than superficial, what is their relationship to the derived changes?

Another aim of the work herein is to detail whether, and to what extent, any possible effects on personality traits propagate to other aspects of consciousness, such as political views and general outlook (i.e., are you a positive person or a negative one? Do you have an internal or external locus of control? Are you compassionate and caring, or callous and ruthless?).

Hypothesis: Ingestion of psychedelic substances is capable of creating significant and lasting transformations of personality and these effects will be integrated enough with personality as a whole to effect changes in general outlook and beliefs.





TBD

Pétur Kristófer Oddsson

Sharing processing power as a more positive alternative to advertisements. Advertisements in applications that fill up window space on computers and mobile phones ( http://www.tandfonline.com/doi/abs/10.1080/10864415.2004.11044301 ) have been shown to have a negative effect on the users of these programs. What if these applications could remove the advertisements and mine cryptocurrency under the hood instead? Would old applications still be used today, or would we switch to new applications, if users were given the option to exchange advertisements for sharing their processing power? The money spent on in-application advertising now surpasses $100 billion. A universal API (Application Programming Interface) for each platform (iOS, Android, Windows) could let users turn off the advertisements. Users would be more likely to use their applications if there were no advertisements and instead a small percentage of their processing power were used. I think developers could earn coins, or contribute the processing power to a worldwide supercomputer that solves computationally heavy tasks, and get paid for that instead of for advertisements. Users would be more likely to use these applications, and with much more positive effects, if the applications mined under the hood instead of showing advertisements.





The different effects of display settings on human eyes

Arnar Bjarni Arnarson

Experiment: Users play a computer game for 4 hours (the same game for all users) on an LED monitor with different refresh rates (60 Hz, 120 Hz, 144 Hz). The content on the monitor is then displayed at different frame rates (24, 30, 60, 120 and 144 fps). Tests are performed on separate days so the users are fully rested between trials. The users are asked whether they are experiencing the symptoms of computer vision syndrome (CVS) and are split into groups based on what frame rate they are used to (usually the refresh rate of their own monitor, although there may be special cases).

Does higher frame rate reduce strain on the eyes?

Does higher refresh rate reduce strain on the eyes?

Does it matter what frame rate or refresh rate we are used to?

Hypothesis: Both refresh rate and frame rate have negligible effects but our usual settings will make us feel more comfortable.

Problems: Gameplay differs between users (different lighting in different areas), emotional impact from playing the game, and users might be sick.
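
A minimal sketch of how the symptom reports could be compared across refresh-rate conditions, assuming a simple yes/no CVS-symptom count per condition (the counts below are placeholders, not collected data):

<code python>
# Sketch: test whether reported CVS symptoms depend on refresh rate.
# The counts are illustrative placeholders, not measured results.
from scipy.stats import chi2_contingency

# Rows: 60 Hz, 120 Hz, 144 Hz conditions; columns: [symptoms, no symptoms].
observed = [
    [12, 18],   # 60 Hz
    [10, 20],   # 120 Hz
    [9, 21],    # 144 Hz
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
# A large p-value would be consistent with the hypothesis that refresh rate
# has a negligible effect on eye strain.
</code>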





How does hype influence the price of cryptocurrencies?

Michelangelo Diamanti

In this comparative experiment I will try to understand the magnitude of the influence of hype on the price of cryptocurrencies:

  1. Create two cryptocurrencies which address two different fields, so the price of one does not affect the other. Each crypto will start with the same price, volume and market cap.
  2. Organize a closed beta testing in which each of the two cryptos is introduced to a group of people, so C1 is assigned to G1 and C2 is assigned to G2. Each group will only receive information and be able to trade its own cryptocurrency.
  3. During the beta we will be in charge of the marketing campaigns of the two cryptos, so we will be the ones creating and spreading rumors and news, which are known to produce hype.
  4. We will then be able to keep track of exactly when a particular piece of information reaches the desired group of people, and how it affects the price of the crypto.
  5. We might also let participants spread some rumors about the crypto they are trading, in order to see if there is a difference between hierarchically generated hype and peer-to-peer hype.

At the end of the beta period we will compare the growth of the two currencies and confirm or refute the theory according to which hype almost completely determines the price of a cryptocurrency.
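
A minimal sketch of how point 4 could be quantified during the beta, assuming the price series and the release time of each rumor or news item are logged with timestamps (file names and the 24-hour window are illustrative assumptions):

<code python>
# Sketch: measure the relative price change of one test cryptocurrency in a
# fixed window after each released rumor/news item. File names and the 24 h
# window are placeholders.
import pandas as pd

prices = (pd.read_csv("c1_prices.csv", parse_dates=["timestamp"])
            .set_index("timestamp")["price"].sort_index())
news = pd.read_csv("c1_news.csv", parse_dates=["released_at"])

window = pd.Timedelta(hours=24)
effects = []
for released_at in news["released_at"]:
    before = prices.asof(released_at)            # last price at/before the release
    after = prices.asof(released_at + window)    # price one window later
    effects.append((after - before) / before)

news["relative_price_change"] = effects
print(news[["released_at", "relative_price_change"]])
</code>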





Improving usability of medical imaging viewer for virtual reality

Tomáš Michalík

I have found an interesting paper ( http://ieeexplore.ieee.org/abstract/document/7892382/ ) on which I would like to build a topic for this writing.

The paper mentioned above describes a method of visualizing volumetric data (in this case MRI). It displays a part of the data in a decomposed form (single voxels are displayed with gaps in between). Because the rendering is computationally expensive, only a subset of the data can be displayed.

A possible improvement is to use Leap Motion, a hand-tracking technology, instead of the HTC Vive 3D controllers. My hypothesis is that using the hand itself, without any controller in it, will enable the user to interact with the environment with higher precision.

Why this may have a significant impact: increased precision allows us to display the data at a smaller scale (with higher precision the user will still be able to do the same task, e.g. selecting particular voxels). The smaller the data on the screen, the higher the frame rate that is possible (the most computationally expensive part is rendering the volumetric data, not the surrounding scene). This could be used to increase the frame rate, or to display a larger subset of the data at the same frame rate.





Comparing the Burrows-Wheeler Aligner (BWA) and the Bowtie 2 Software for Short-Read Alignment

Antton Lamarca

Both the Burrows-Wheeler Aligner (BWA) and Bowtie 2 are used to map short DNA sequence reads against a known genome. However, the computational strategies they use differ, and as a result their performance varies. For example, Bowtie 2 seems to be considerably faster.

There are other candidates as well, so the comparison could involve additional programs. The good thing about these software packages is that, since they use essentially the same input data, comparing their output is quite intuitive.
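
A minimal sketch of how the speed part of the comparison could be scripted, assuming both tools are installed and that a reference FASTA plus a FASTQ of reads are available (file names are placeholders; only the tools' standard command-line usage is assumed):

<code python>
# Sketch: time BWA-MEM and Bowtie 2 on the same reads and reference genome.
# File names are placeholders; index building is done once and not timed.
import subprocess
import time

REF = "reference.fa"
READS = "reads.fastq"

subprocess.run(["bwa", "index", REF], check=True)
subprocess.run(["bowtie2-build", REF, "ref_index"], check=True)

def timed(cmd, sam_out):
    start = time.time()
    with open(sam_out, "w") as sam:
        subprocess.run(cmd, stdout=sam, check=True)   # both tools write SAM to stdout
    return time.time() - start

bwa_seconds = timed(["bwa", "mem", REF, READS], "bwa.sam")
bowtie2_seconds = timed(["bowtie2", "-x", "ref_index", "-U", READS], "bowtie2.sam")

print(f"BWA-MEM:  {bwa_seconds:.1f} s")
print(f"Bowtie 2: {bowtie2_seconds:.1f} s")
</code>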





Intrusions in the cloud - A comparison of three approaches

Guðný Lára Guðmundsdóttir

I have researched this further and found that several intrusion detection systems have been suggested and/or implemented in papers. Would it make sense to choose two of those, explain and compare them, so the hypothesis could be that one of them does something better?

An intrusion is classified here as the process of entering a network without proper authentication.

I found a very recent paper from 2017 that classifies intrusion detection techniques into four categories:

  1. Misuse detection
  2. Anomaly-based detection
  3. Virtual Machine Introspection
  4. Hypervisor-based introspection

I thought I could either compare 2-4 of those or compare the techniques used in one of those.

This “doing better” could be tested experimentally by performing various intrusions and checking whether the system detected each intrusion or not, i.e. simply comparing how many intrusions were detected per system.





Does design-driven development lead to more effective software?

Julia Elisabeth Haidn

The paper would try to answer this question and also include a comparison of design-driven software development and established software development techniques in order to evaluate the concept of “more effective”.

The word “effective” could mean, e.g., customer acceptance, coverage of customer needs, generating revenue (→ making a product successful), reducing risk, or increasing speed and learning.





Sustainability of Diets With Low Environmental Negative Impact

Matteo Altobelli

Recent studies show that the impact of different diets on the environment is not exactly as we imagine it. A common belief is, in fact, that the vegan diet (based on the exclusive consumption of products of plant origin) is by far the best choice. Recent studies involving large national and international producer countries (such as the United States, Australia, Spain, Italy, etc.) have shown, however, that the vegan diet is less sustainable than many variants of the vegetarian diet, and even than some omnivorous ones: the intensive use of fields for cultivation, combined with the lack of use of grazing land (which is often not suitable for cultivation), would feed fewer people and cause, in the long run, greater exploitation of the environment.

Regarding the health aspect, the research proposes a study on samples that vary in age and habits, in order to analyze the effects of the diets under different conditions. For the environmental aspect, land from different environments will be used. These soils will be cultivated for a fairly long period of time, in order to meet the food demand of a number of people appropriate for the area used. In the long term, it will therefore be possible to collect the results of the research.

The experiment at the environmental level will proceed as follows:

  1. The land will be cultivated following the principles of organic farming (very similar to the “classical rotation”), rather than those of intensive agriculture. The reasons are simple: although organic farming can have yields up to 20% lower than conventional agriculture, the environmental benefits of the former make up for the significantly lower yields. Even though conventional agriculture produces more food, it does so at the expense of the environment: loss of biodiversity, environmental degradation and serious impacts on ecosystem services. Organic farming, on the other hand, tends to store more carbon in the soil, improving its quality and reducing erosion. This type of agriculture reduces soil and water pollution as well as greenhouse gas emissions, and it is more energy efficient because it does not rely on synthetic fertilizers and pesticides.
  2. At each harvest, the quantity and quality of the food produced will be measured; different parameters for the evaluation of land exploitation (pH, salinity, acidity, alkalinity, mineral fraction, total and active limestone, organic matter, nitrogen, phosphorus, potassium, magnesium, calcium, heavy metals, biological activity) will be analyzed to study the state of the environment.
  3. In soils where there would be livestock (in the case of an omnivorous diet), a portion of the land dedicated exclusively to rest and, therefore, to grazing livestock will be added to the classical rotation.

After a sufficiently long period (3 to 5 years), the results will be compared and published.





Deep Reinforcement Learning for Multiplayer Games

Guðmundur Páll Kjartansson

Reinforcement Learning (RL) has been very successful in creating AI that can learn to play games. It is a technique based on behavioural psychology, where actions that lead to good results are rewarded and those that lead to bad results are punished. One notable work on RL is the 2013 paper by DeepMind, where they demonstrated a learning technique that could teach itself 7 different Atari games [1]. They combined a convolutional neural network for analysing video input with a Markov Decision Process-based (stochastic) learning model. A remaining question is how well suited their approach is for learning to play multiplayer games where you have to predict the behaviour of other agents. If it is not well suited, can we modify their technique based on some previous work [2][3] on multiplayer games?

For this paper, I will choose Scorched Earth and Worms as the multiplayer games. These are both turn-based and seem very fitting for reinforcement learning. We will assume 3 players that are all competing against each other.

Experiments: 1. Constant vs time-dependent learning rate:

Compare the win/loss ratio of constant and time-dependent learning rates with the Q-learning algorithm. Here, time-dependent means that the learning rate of the algorithm will decrease in each iteration as a function of the number of iterations. The assumption is that after some fixed amount of iterations, time-dependent learning will outperform any tested constant learning rate.
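
A minimal sketch of the tabular Q-learning update with the two learning-rate schedules being compared (the environment interface, the 1/(1+kt) decay schedule and the hyperparameter values are illustrative assumptions, not the proposal's final choices):

<code python>
# Sketch: tabular Q-learning with either a constant or a time-dependent
# (decaying) learning rate. The `env` interface (reset/step/actions) and all
# hyperparameter values are illustrative assumptions.
import random
from collections import defaultdict

GAMMA = 0.95      # discount factor
EPSILON = 0.1     # exploration rate

def learning_rate(t, constant=None, k=0.001):
    """Constant alpha if given, otherwise a rate that decays with iteration t."""
    return constant if constant is not None else 1.0 / (1.0 + k * t)

def q_learning(env, episodes, constant_alpha=None):
    Q = defaultdict(float)   # maps (state, action) -> estimated value
    t = 0
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            # Epsilon-greedy action selection.
            if random.random() < EPSILON:
                action = random.choice(env.actions(state))
            else:
                action = max(env.actions(state), key=lambda a: Q[(state, a)])
            next_state, reward, done = env.step(action)
            best_next = 0.0 if done else max(Q[(next_state, a)] for a in env.actions(next_state))
            alpha = learning_rate(t, constant_alpha)
            # Standard Q-learning update rule.
            Q[(state, action)] += alpha * (reward + GAMMA * best_next - Q[(state, action)])
            state = next_state
            t += 1
    return Q
</code>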

2. Loss-to-win ratio decreases exponentially with more training time:

Simple experiment. Here we assume a certain relationship between training time of the Q-learning algorithm and loss-to-win ratio … keeping all other parameters fixed.

3. Competing agents vs paranoid:

This is a more game-theoretic idea. Compare the performance of a Q-learning model that assumes all opponents are playing to win, with another model that assumes that all opponents are teamed up against the learning agent.





Comparison of Increased Block Sizes and the Lightning Network - Improving Transaction Processing of Cryptocurrencies

Elías Ingi Elíasson

One of the bigger problems in the cryptocurrency world is processing transactions. For cryptocurrency transactions to go through and be approved, they have to pass through a rather slow process. The transactions are assigned to a so-called block, which can only store a limited number of transactions. For the transactions to be resolved, they are checked for validity and compared with previous transactions and the user's account balance. Additionally, each block has to find a specific hash in order to create the next block that will store the next batch of transactions, creating a chain of blocks, the blockchain. The process of finding a hash usually takes about 10 minutes. Those working on resolving the blocks are called miners, and they are heavily rewarded for finding the correct hash. This delay, as well as the limited transaction storage space in each block, has led people to add transaction/transfer fees to their transactions, rewarded to the miners, so that their transactions will be selected for processing sooner than others.

Numerous ideas have come up on how this delay can be shortened, both for the purpose of faster transactions and so that people aren't “forced” to add a fee to their transactions in order for them to be processed. Among those ideas is doubling the size of the blocks so that they can store more transactions, which would result in more transactions being processed for each hash. Another idea is a bidirectional payment channel called the Lightning Network. The Lightning Network lets two parties that frequently engage in cryptocurrency transactions create a “transfer channel” between each other, separate from the blockchain. An example of this is a person who purchases a cup of coffee for 1 coin every day at the same coffeehouse. He makes an initial deposit of 100 coins to the channel, and after he has purchased 100 cups of coffee the channel balance is depleted; a single transfer transaction is then assigned to a block instead of 100 smaller ones, resulting in fewer transactions per block.
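
A minimal sketch that restates the coffee example in code, showing how a payment channel collapses many small payments into a couple of on-chain transactions (this models only the bookkeeping, not the actual Lightning Network protocol):

<code python>
# Sketch: a toy payment channel illustrating the coffee example above.
# Only the bookkeeping is modelled, not the real Lightning Network protocol.

class PaymentChannel:
    def __init__(self, deposit):
        self.balance = deposit          # coins locked into the channel
        self.off_chain_payments = 0     # payments made inside the channel
        self.on_chain_transactions = 1  # the opening (deposit) transaction

    def pay(self, amount):
        if amount > self.balance:
            raise ValueError("channel balance depleted")
        self.balance -= amount
        self.off_chain_payments += 1

    def close(self):
        # Settling the final balance is one more on-chain transaction.
        self.on_chain_transactions += 1
        return self.on_chain_transactions

# 100 coins deposited, one coin per cup of coffee, 100 cups bought.
channel = PaymentChannel(deposit=100)
for _ in range(100):
    channel.pay(1)

print("Off-chain payments:    ", channel.off_chain_payments)  # 100
print("On-chain transactions: ", channel.close())             # 2 (open + settle) instead of 100
</code>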

Looking into how these ideas differ and what the benefits and drawbacks of each one are (and maybe adding other ideas that I haven't looked into yet) would, I think, make for a good comparative experiment for finding an ideal method of processing cryptocurrency transactions with respect to processing speed, number of transactions and transfer fees.





MongoDB, Oracle, Microsoft SQL Server - Which database system is better for an ERP system?

Jaroslav Fedorcák

Large companies, such as market-leading e-shops, depend a great deal on the performance of their ERP systems. In some cases companies develop their own ERP rather than buying boxed software, which means that the performance of their whole internal IT infrastructure depends on database performance.

There are many database systems to choose from, but the most widely used are MongoDB, Microsoft SQL Server and Oracle. Their performance and maintenance requirements vary, which raises a question: which of these systems is better for storing and using company data, especially during the most demanding seasons, such as the Christmas holidays?

The main properties that matter in such an environment are: speed of inserts into logging tables; speed of inserts, updates, deletes and selects on tables with products/orders; and speed of stored procedures (including multiple chains of stored procedures calling each other) used in different ways to handle cheques, credit notes and other banking instruments.
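
A minimal sketch of how the insert-speed part of such a benchmark could be structured, with the database-specific insert calls passed in as functions (the in-memory stand-in below only illustrates the harness; real runs would plug in MongoDB, SQL Server and Oracle client calls):

<code python>
# Sketch: a generic timing harness for the insert benchmarks described above.
# The in-memory "table" is a placeholder; real measurements would supply
# functions that insert into MongoDB, SQL Server and Oracle respectively.
import time

def benchmark(name, insert_fn, rows):
    start = time.perf_counter()
    for row in rows:
        insert_fn(row)
    elapsed = time.perf_counter() - start
    print(f"{name}: {len(rows) / elapsed:.0f} inserts/second")

# Placeholder "database": an in-memory list standing in for a logging table.
log_table = []
def in_memory_insert(row):
    log_table.append(row)

sample_rows = [{"order_id": i, "status": "created"} for i in range(10000)]
benchmark("in-memory stand-in", in_memory_insert, sample_rows)
</code>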

Hypothesis: Given that NoSQL databases such as MongoDB lack stored procedures, and their replacement is server-side JavaScript, Oracle and SQL Server are a better solution for an internal ERP system for the purpose of fast order processing on the database side.

This is only one hypothesis, but it concerns all three systems.





Giulio Mori

Nowadays, with the huge increase in Bitcoin's price, cryptocurrencies are very popular and a lot of people want to invest money in them. But choosing the right currency to invest in is very hard, and only people with knowledge of economics can make a selection among them. With this paper I will compare some methods found in other scientific papers (Link) or (Link), and others, to try to predict which currency will be the most profitable.

I was thinking of trying two methods that are described in two different papers, on different cryptocurrencies, and trying to say which one is the “best” according to these criteria (a sketch of such an evaluation follows the list):

  1. if they really manage to predict trends
  2. percentage of error
  3. percentage of possible gain
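
A minimal sketch of how the three criteria could be computed for one method, assuming arrays of actual and predicted daily prices (the numbers and the buy-when-an-increase-is-predicted strategy are illustrative assumptions):

<code python>
# Sketch: evaluate one prediction method on the three criteria listed above.
# `actual` and `predicted` are placeholder daily prices.
import numpy as np

actual = np.array([100.0, 102.0, 101.0, 105.0, 107.0, 104.0])
predicted = np.array([100.0, 101.5, 102.0, 104.0, 108.0, 105.0])

# 1. Does it really predict trends? -> fraction of correctly predicted directions.
actual_dir = np.sign(np.diff(actual))
predicted_dir = np.sign(np.diff(predicted))
trend_accuracy = np.mean(actual_dir == predicted_dir)

# 2. Percentage of error -> mean absolute percentage error of the price.
mape = np.mean(np.abs((actual - predicted) / actual)) * 100

# 3. Percentage of possible gain -> return of buying whenever an increase is predicted.
daily_returns = np.diff(actual) / actual[:-1]
strategy_gain = np.prod(1 + daily_returns[predicted_dir > 0]) - 1

print(f"trend accuracy: {trend_accuracy:.0%}")
print(f"MAPE: {mape:.2f}%")
print(f"gain on predicted-up days: {strategy_gain:.1%}")
</code>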







