
2024 | Original Paper | Book Chapter

9. Machine Learning for Big Data Analytics

Authors: Ümit Demirbaga, Gagangeet Singh Aujla, Anish Jindal, Oğuzhan Kalyon

Published in: Big Data Analytics

Publisher: Springer Nature Switzerland


Abstract

This chapter examines how machine learning can be used to extract meaningful insights from large volumes of data. It begins with supervised machine learning for big data analytics, outlining the challenges inherent in its application and the pre-processing steps required for reliable results. Popular supervised algorithms are reviewed, including Linear Regression, Logistic Regression, Decision Tree, Random Forest, Support Vector Machines, the Naïve Bayes Classifier, and K-Nearest Neighbour. The chapter then turns to unsupervised machine learning, covering techniques such as K-means Clustering, Hierarchical Clustering, Density-Based Spatial Clustering of Applications with Noise (DBSCAN), Gaussian Mixture Models, Principal Component Analysis, t-distributed Stochastic Neighbour Embedding (t-SNE), the Apriori Algorithm, Isolation Forest, and Expectation-Maximisation. It concludes with neural network algorithms, the fundamentals of probabilistic learning, and techniques for performance evaluation and optimisation, giving a broad view of machine learning paradigms tailored to the challenges of big data analytics.
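To make the paradigms listed above concrete, the short Python sketch below shows how a handful of the named algorithms (Random Forest, PCA, K-means, Isolation Forest) are commonly combined with scikit-learn. It is an illustrative assumption using synthetic data, not code from the chapter itself.

```python
# Minimal sketch (not from the chapter): a few of the algorithms named in the
# abstract applied with scikit-learn on synthetic, purely illustrative data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier, IsolationForest
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.metrics import classification_report

# Synthetic tabular data standing in for a large, labelled dataset.
X, y = make_classification(n_samples=5000, n_features=20,
                           n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Pre-processing: standardise features before distance/variance-based methods.
scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# Supervised learning: Random Forest with 5-fold cross-validation.
clf = RandomForestClassifier(n_estimators=200, random_state=42)
print("CV accuracy:", cross_val_score(clf, X_train_s, y_train, cv=5).mean())
clf.fit(X_train_s, y_train)
print(classification_report(y_test, clf.predict(X_test_s)))

# Unsupervised learning: PCA for dimensionality reduction, then K-means.
X_2d = PCA(n_components=2).fit_transform(X_train_s)
cluster_labels = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(X_2d)

# Anomaly detection: Isolation Forest on the standardised features (-1 = anomaly).
outliers = IsolationForest(random_state=42).fit_predict(X_train_s)
print("Flagged anomalies:", (outliers == -1).sum())
```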


Metadata
Title
Machine Learning for Big Data Analytics
Authors
Ümit Demirbaga
Gagangeet Singh Aujla
Anish Jindal
Oğuzhan Kalyon
Copyright Year
2024
DOI
https://doi.org/10.1007/978-3-031-55639-5_9
