Utilization of Big Data via Data Compression
One of the most important issues in AI is how to efficiently use the big data saturating the world.
At the same time, the performance of devices and networks has plateaued, and because of power-consumption constraints,
it is difficult to access such huge data directly.
We have been developing new technologies for this problem based on data compression and related algorithms.
Short CV
- Hiroshi SAKAMOTO
- Professor, Graduate School of Computer Science and Systems Engineering, Kyushu Institute of Technology
- Ph.D.
- Doctor of Science, Kyushu University, Dec. 1998
- Research Interests
- Intelligent Informatics, especially big data processing via data compression and algorithms (Google Scholar & DBLP)
- Academic Society Memberships
- JSAI, IEICE, DBSJ
Ongoing Research Project
Project 1: Sub-linear space compression algorithm
Coping with big data requires an innovation in fundamental algorithms.
At present, we cannot extract interesting information from such data because of its sheer size:
data too large to survey is as good as nonexistent.
To address this situation, we have taken on sub-linear time/space computing.
Our research group proposes a solution: sub-linear space compression, using a compression algorithm
as preprocessing for all subsequent computation.
See more information about this project.
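To illustrate the idea of compression as preprocessing, here is a minimal Re-Pair-style grammar compressor: it repeatedly replaces the most frequent adjacent symbol pair with a fresh nonterminal, so later computation can work on the much shorter compressed sequence. This is only a sketch of the general technique; the function names and the greedy replacement strategy are illustrative assumptions, not the group's actual algorithm.

```python
from collections import Counter

def repair(data):
    """Greedy Re-Pair-style grammar compression (illustrative sketch).

    Repeatedly replaces the most frequent adjacent pair of symbols
    with a new nonterminal until no pair occurs twice.
    Returns the compressed sequence and the grammar rules.
    """
    seq = list(data)
    rules = {}
    next_sym = 256  # nonterminal ids start above byte values
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, freq = pairs.most_common(1)[0]
        if freq < 2:
            break
        rules[next_sym] = pair
        out, i = [], 0
        while i < len(seq):  # left-to-right, non-overlapping replacement
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(next_sym)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        next_sym += 1
    return seq, rules

def expand(sym, rules):
    """Decompress one symbol back into its original bytes."""
    if sym not in rules:
        return [sym]
    left, right = rules[sym]
    return expand(left, rules) + expand(right, rules)
```

For example, `repair(b"abababab")` reduces eight input symbols to a two-symbol sequence plus two grammar rules, and expanding the result recovers the original data exactly.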
Project 2: Stream data compression
Big data often arrives as stream data.
In this case, we cannot store the whole data in memory as we would static data in a database.
Even for such cases, we have been developing real-time computing technology for stream data.
For this problem, we propose several compression algorithms as preprocessing for large-scale data,
and implement applications for real data, e.g., an FPGA compressor developed with the University of Tsukuba
(see more information).
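The one-pass constraint of stream compression can be sketched with a simple LZ78-style compressor that consumes the input chunk by chunk and never holds the whole stream in memory. This is a generic textbook scheme given as an assumption-laden illustration, not the group's FPGA algorithm.

```python
def lz78_stream(chunks):
    """One-pass LZ78-style compression over a chunked byte stream
    (illustrative sketch): yields (dictionary_index, next_byte)
    pairs as soon as they are formed, without buffering the input.
    """
    dictionary = {b"": 0}
    phrase = b""
    for chunk in chunks:          # chunks arrive one at a time
        for byte in chunk:
            candidate = phrase + bytes([byte])
            if candidate in dictionary:
                phrase = candidate  # extend the current phrase
            else:
                yield dictionary[phrase], byte
                dictionary[candidate] = len(dictionary)
                phrase = b""
    if phrase:  # flush the final unfinished phrase
        yield dictionary[phrase[:-1]], phrase[-1]

def lz78_decode(pairs):
    """Rebuild the original bytes from (index, byte) pairs."""
    dictionary = [b""]
    out = []
    for idx, byte in pairs:
        phrase = dictionary[idx] + bytes([byte])
        out.append(phrase)
        dictionary.append(phrase)
    return b"".join(out)
```

Because the compressor is a generator, it can be plugged directly into a network or file stream; only the dictionary (which a practical implementation would also bound) lives in memory, matching the setting where the whole input cannot be stored.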