Normalization
Normalization is the process of reducing data redundancy in a database. Without normalization, the same data must be entered repeatedly.
Data reduction in data mining refers to the process of reducing the volume of data under consideration. It can involve techniques such as feature selection, dimensionality reduction, or sampling, which simplify the dataset and make it more manageable for analysis. By reducing the data, analysts can focus on the most relevant information and improve the efficiency of the data mining process.
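A minimal sketch of two of the techniques mentioned above, random sampling and a simple variance-based feature selection; the dataset, column indices, and threshold are made up for illustration:

```python
import random

def sample_rows(rows, k, seed=0):
    """Reduce volume by keeping a random sample of k rows."""
    rng = random.Random(seed)
    return rng.sample(rows, k)

def variance(values):
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def select_features(rows, threshold=0.0):
    """Drop columns whose values barely vary (they carry little information)."""
    cols = list(zip(*rows))  # transpose: rows -> columns
    keep = [i for i, col in enumerate(cols) if variance(col) > threshold]
    return [[row[i] for i in keep] for row in rows], keep

data = [[1.0, 5.0, 3.1], [1.0, 6.0, 2.9], [1.0, 7.0, 3.0], [1.0, 8.0, 3.2]]
reduced, kept = select_features(data)
print(kept)  # the constant first column is dropped
```

Real projects would typically use library routines (e.g. in a statistics or machine-learning package) rather than hand-rolled helpers, but the idea is the same: keep only the rows and columns that carry useful signal.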
Normalisation is the process of taking data from a problem and reducing it to a set of relations, while ensuring data integrity and eliminating data redundancy.
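As a rough sketch of what "reducing data to a set of relations" looks like, here is a flat table that repeats customer details on every order being split into two relations; all field names and values are hypothetical:

```python
# Denormalized input: customer name and city repeat on every order row.
orders_flat = [
    {"order_id": 1, "customer": "Ada", "city": "London", "item": "pen"},
    {"order_id": 2, "customer": "Ada", "city": "London", "item": "ink"},
    {"order_id": 3, "customer": "Bob", "city": "Paris",  "item": "pad"},
]

# Relation 1: one row per customer, so each customer's details are stored once.
customers = {}
for row in orders_flat:
    customers[row["customer"]] = {"customer": row["customer"], "city": row["city"]}

# Relation 2: orders reference the customer by key instead of repeating details.
orders = [
    {"order_id": r["order_id"], "customer": r["customer"], "item": r["item"]}
    for r in orders_flat
]

print(len(customers))  # "Ada" is now stored once instead of twice
```

Updating Ada's city now means changing one row in `customers` rather than hunting down every order, which is exactly the redundancy and integrity benefit normalization aims for.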
Data checking, editing, and proof-reading.
normalization
Vectorization is the process of converting raster data into vector data; the opposite process is called rasterization.
Hardware
If we want to transmit highly secret data, we can encrypt it before sending it to the receiver. This process is called encryption.
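A toy illustration of the encrypt-then-transmit-then-decrypt round trip described above. The XOR cipher used here is NOT secure and is only a stand-in; real systems use vetted algorithms such as AES through an established cryptography library:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy cipher: XOR each byte with a repeating key (illustrative only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret = b"most secret data"
key = b"shared-key"                      # both parties must know the key

ciphertext = xor_cipher(secret, key)     # sender encrypts before transmitting
plaintext = xor_cipher(ciphertext, key)  # receiver decrypts (XOR is its own inverse)

print(plaintext == secret)  # True
```

The point is the shape of the protocol: only the ciphertext travels over the channel, and only a holder of the key can recover the original data.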
Integrity is basically consistency, and we need it to manage large volumes of data reliably. It can be compared to integrated courses, which assure the consistency of the courses. Integrity is important for ensuring that stored data is accurate and dependable as a record of the facts.
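One concrete way a database enforces integrity is with constraints that reject inconsistent data at write time. A small sketch using Python's built-in sqlite3 module; the table and column names are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# A CHECK constraint encodes a consistency rule: credits must be positive.
con.execute(
    "CREATE TABLE course (id INTEGER PRIMARY KEY, "
    "credits INTEGER CHECK (credits > 0))"
)

con.execute("INSERT INTO course VALUES (1, 4)")  # valid row is accepted

try:
    con.execute("INSERT INTO course VALUES (2, -1)")  # violates the rule
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # the database refuses the inconsistent row
```

Because the rule lives in the database itself, every application writing to the table is held to the same consistency guarantee.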
A data center is a facility where digital information is processed, stored, and delivered. Many data centers operate worldwide.