An algorithm for large-scale genomic analysis

Dec 22, 2019, 10:18pm UTC
Anonymous $9ruWwTnhZq

https://www.sciencedaily.com/releases/2019/12/191220105629.htm

The analysis of genetic data is becoming increasingly important, particularly in the field of personalized medicine. The number of human genomes sequenced each year is growing exponentially, and the largest databases now include more than one million individuals. This wealth of data is extremely valuable for better understanding the genetic destiny of humanity, whether to determine the genetic contribution to a particular disease or to trace the history of human migration. To be meaningful, however, these big data must be processed computationally. "However, the processing power of computers remains relatively stable, unlike the ultra-fast growth of genomic Big Data," says Olivier Delaneau, SNSF professor in the Department of Computational Biology at the UNIL Faculty of Biology and Medicine and at SIB, who led this work. "Our algorithm thus aims to optimize the processing of genetic data so as to absorb this amount of information and make it usable by scientists, despite the gap between its quantity and the comparatively limited power of computers."

Better understanding the role of haplotypes
