
Issue: A generalized net model of the stochastic gradient descent and dropout algorithm with intuitionistic fuzzy evaluations

Shortcut: http://ifigenia.org/wiki/issue:nifs/26/4/80-89
Title of paper: A generalized net model of the stochastic gradient descent and dropout algorithm with intuitionistic fuzzy evaluations
Author(s):
Plamena Yovcheva
“Prof. Dr. Assen Zlatarov” University, 1 “Prof. Yakimov” Blvd., Burgas 8010, Bulgaria
plamena.iovcheva@abv.bg
Sotir Sotirov
“Prof. Dr. Assen Zlatarov” University, 1 “Prof. Yakimov” Blvd., Burgas 8010, Bulgaria
ssotirov@btu.bg
Published in: Notes on Intuitionistic Fuzzy Sets, Volume 26 (2020), Number 4, pages 80–89
DOI: https://doi.org/10.7546/nifs.2020.26.4.80-89
Download: PDF (170 KB)
Abstract: In this paper, we consider the stochastic gradient descent algorithm in combination with the dropout method. We use the theory of intuitionistic fuzzy sets to assess the equivalence of the respective assessment units, and we also account for a degree of uncertainty when the available information is insufficient.
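For readers who want a concrete picture of the ingredients named in the abstract, the following is a minimal sketch, not the authors' generalized net model: it trains a small network with per-sample stochastic gradient descent and inverted dropout, then scores the result as an intuitionistic fuzzy pair (μ, ν) with degree of uncertainty π = 1 − μ − ν. The network size, learning rate, keep probability, and confidence threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2 features, binary labels with an XOR-like boundary.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float)

# One hidden layer; sizes and hyperparameters are assumptions for illustration.
W1 = rng.normal(0.0, 0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, size=(16, 1)); b2 = np.zeros(1)
lr, p_keep, epochs = 0.5, 0.8, 300

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(epochs):
    for i in rng.permutation(len(X)):          # stochastic: one sample at a time
        x, t = X[i:i + 1], y[i]
        h = np.tanh(x @ W1 + b1)
        mask = (rng.uniform(size=h.shape) < p_keep) / p_keep  # inverted dropout
        h_d = h * mask
        out = sigmoid(h_d @ W2 + b2)[0, 0]
        # Cross-entropy gradients, backpropagated through the dropout mask.
        d_out = out - t
        dW2, db2 = h_d.T * d_out, np.array([d_out])
        dh = (d_out * W2.T) * mask * (1.0 - h ** 2)
        dW1, db1 = x.T @ dh, dh[0]
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

# Intuitionistic fuzzy evaluation of the trained network: confident correct
# predictions count toward membership mu, confident errors toward
# non-membership nu, and low-confidence cases remain uncertain (pi).
h = np.tanh(X @ W1 + b1)                 # dropout is disabled at evaluation time
probs = sigmoid(h @ W2 + b2).ravel()
confident = np.abs(probs - 0.5) > 0.25   # assumed confidence threshold
correct = (probs > 0.5) == (y > 0.5)
mu = np.mean(confident & correct)        # degree of membership
nu = np.mean(confident & ~correct)       # degree of non-membership
pi = 1.0 - mu - nu                       # degree of uncertainty
print(f"mu = {mu:.3f}, nu = {nu:.3f}, pi = {pi:.3f}")
```

By construction μ + ν ≤ 1, so π ≥ 0, as the intuitionistic fuzzy definition requires; the low-confidence cases here play the role of the "insufficient information" situation mentioned in the abstract.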
Keywords: Neural networks, Dropout algorithm, Generalized net, Stochastic gradient descent algorithm, Intuitionistic fuzzy sets
AMS Classification: 68Q85, 03E72.
References:
  1. Atanassov, K. (1983). Intuitionistic fuzzy sets. Proc. of VII ITKR's Session, Sofia, June (in Bulgarian).
  2. Atanassov, K. (1986). Intuitionistic fuzzy sets. Fuzzy Sets and Systems, 20(1), 87–96.
  3. Atanassov, K. (1991). Generalized nets. World Scientific, Singapore, New Jersey, London.
  4. Atanassov, K. (1999). Intuitionistic Fuzzy Sets. Springer, Heidelberg.
  5. Atanassov, K. (2007). On Generalized Nets Theory. “Prof. Marin Drinov” Academic Publishing House, Sofia.
  6. Atanassov, K. (2012). On Intuitionistic Fuzzy Sets Theory. Springer, Berlin.
  7. Atanassov, K. (2016). Generalized Nets as a Tool for the Modelling of Data Mining Processes. Innovative Issues in Intelligent Systems, Vol. 623, Studies in Computational Intelligence, 161–215.
  8. Atanassov, K., & Sotirov, S. (2006). Optimization of a neural network of self-organizing maps type with time-limits by a generalized net. Advanced Studies in Contemporary Mathematics, 13(2), 213–220.
  9. Atanassov, K., Sotirov, S., & Antonov, A. (2007). Generalized net model for parallel optimization of feed-forward neural network. Advanced Studies in Contemporary Mathematics, 15(1), 109–119.
  10. Barrow, E., Eastwood, M., & Jayne, Ch. (2016). Selective Dropout for Deep Neural Networks. Neural Information Processing, 519–528.
  11. Bureva, V. (2014). Intuitionistic fuzzy histograms in grid-based clustering. Notes on Intuitionistic Fuzzy Sets, 20(1), 55–62.
  12. Bureva, V., Sotirova, E., & Atanassov, K. (2014). Hierarchical generalized net model of the process of clustering. Issues in Intuitionistic Fuzzy Sets and Generalized Nets, Vol. 1, Warsaw School of Information Technology, 73–80.
  13. Bureva, V., Sotirova, E., & Atanassov, K. (2014). Hierarchical generalized net model of the process of selecting a method for clustering. 15th Int. Workshop on Generalized Nets, Burgas, 16 October, 39–48.
  14. Bureva, V., Sotirova, E., & Chountas, P. (2015). Generalized Net of the Process of Sequential Pattern Mining by Generalized Sequential Pattern Algorithm (GSP). Intelligent Systems’2014, Springer, Cham, 831–838.
  15. Fukushima, K. (2005). Restoring partly occluded patterns: a neural network model. Neural Networks, 18(1), 33–43.
  16. Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning, The MIT Press.
  17. Hagan, M., Demuth, H., & Beale, M. (2010). Neural Network Toolbox 7.
  18. Krawczak, M. (2003). Generalized Net Models of Systems. Bulletin of the Polish Academy of Sciences.
  19. Krizhevsky, A., Sutskever, I., & Hinton, G. (2012). ImageNet classification with deep convolutional neural networks. In Advances in Neural Information Processing Systems, 25, 1106–1114.
  20. LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444.
  21. Sotirov, S. (2003). Modeling the algorithm Backpropagation for training of neural networks with generalized nets – part 1. Proceedings of the Fourth International Workshop on Generalized Nets, Sofia, 23 September 2003, 61–67.
  22. Sotirov, S. (2006). Generalized net model of the accelerating backpropagation algorithm. Proceedings of the Jangjeon Mathematical Society, 217–225.
  23. Sotirov, S. (2010). Generalized net model of the Time Delay Neural Network. Issues in Intuitionistic Fuzzy Sets and Generalized Nets, Warsaw, 125–131.
  24. Sotirov, S., & Krawczak, M. (2007). Modeling the algorithm Backpropagation for learning of neural networks with generalized nets – Part 2. Issues in Intuitionistic Fuzzy Sets and Generalized Nets, Warsaw, 65–70.
  25. Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: a simple way to prevent neural networks from overfitting. The Journal of Machine Learning Research, 15(1), 1929–1958.
  26. Torralba, A., Fergus, R., & Weiss, Y. (2008). Small codes and large databases for recognition. In Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR’08), 1–8.
  27. Tsuruoka, Y., Tsujii, J., & Ananiadou, S. (2009). Stochastic gradient descent training for L1-regularized log-linear models with cumulative penalty. Proceedings of ACL-IJCNLP ’09.
  28. Zadeh, L. A. (1965). Fuzzy sets. Information and Control, 8(3), 338–353.
Citations:

The list of publications citing this article may be empty or incomplete. If you can provide relevant data, please write on the talk page.