"But the main feature required besides those is running big analyses on the database at maximum speed."
So now all you need is 90TB+ of RAM and you're set. "Maximum" speed is a very relative concept.
"I have got about 90TB of text in ~200 tables. This is structured, related data. Any true relational, distributed and performant database would do the job."
What is a "true relational distributed database"?
Let's flip this around. Let's say that you had 90 servers and they each held 1TB of data. What's your plan to perform joins amongst your 200 tables and 90 servers?
In general, cross-server joins don't scale very well. Trying to run joins across 90 servers is probably going to scale even worse. Partitioning 200 tables is a lot of work.
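To make that scaling concern concrete, here is a minimal Python sketch of hash partitioning and the "scatter" step of a distributed join. The server count, key formats, and function names are illustrative assumptions, not any particular database's API; the point is that tables co-partitioned on the same key can join locally, while tables partitioned on different keys force every server to reshuffle rows over the network.

```python
import hashlib
from collections import defaultdict

# Illustrative assumption: 90 servers, matching the hypothetical above.
NUM_SERVERS = 90

def server_for(key: str) -> int:
    """Map a join key to a server with a stable hash (hypothetical scheme)."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SERVERS

# Tables partitioned on the SAME key: matching rows land on the same
# server, so the join can run locally with no network traffic.
assert server_for("article:42") == server_for("article:42")

def scatter(rows, key_of):
    """The 'scatter' half of a scatter-gather join: group each row by the
    server that owns its join key. When 200 tables are partitioned on
    different keys, every such join reshuffles data across all 90 servers,
    which is why cross-server joins scale poorly."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[server_for(key_of(row))].append(row)
    return buckets

# Hypothetical usage: rows joining on 'author_id' must first be shipped to
# the server responsible for that author before the local join can run.
rows = [{"author_id": "a1", "text": "..."}, {"author_id": "a2", "text": "..."}]
by_server = scatter(rows, key_of=lambda r: r["author_id"])
```

In practice a distributed database does this reshuffling for you, but the network cost is the same, which is why choosing the partitioning keys for those 200 tables is the real work.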
"which other databases to keep track of generally in this context and which to drop off the list"
OK, so there are lots of follow-up questions here:
manpreet
Best Answer
3 years ago
Scenario: Say you have 90TB of text in 200 tables. This is structured, related data, comparable to DBpedia, only more of it. Any truly relational, distributed and performant database would do the job. Don't expect as many updates as a social network, but about 500 read queries/s and 20 updates/s. The main feature required besides those is running big analyses on the database at high speed, since the data shall be constantly reworked and improved with machine learning, e.g. Apache Mahout.
Now the first issue is which database technologies to start with (or to wait for to be released) in order to maintain all that data with a relatively low number of web visitors but a high demand for fast analysis/machine learning. And second, which other databases to keep track of for special purposes that may occur, which to drop off the list, and which to treat as pairs of which only one (the better) should be applied.