
Spread of Misinformation through Social Media during the Aravalli Crisis

    1. School of Emerging Science & Technology, Gujarat University, Ahmedabad
    2. Department of Applied Sciences and Humanities, Parul Institute of Technology, Parul University, Vadodara, Gujarat

Abstract

The spread of misinformation on social media platforms has become a significant societal issue, affecting public behaviour and policy-making processes. In this study, we develop a six-compartment deterministic model to understand the dynamics of misinformation spread. The model comprises susceptible, exposed, fake-news, authentic-news, doubtful-news and restrained users. A system of linear ordinary differential equations (ODEs) is developed to represent the transitions between these compartments. The model includes key processes such as content exposure, the spread of true and false information, the development of doubt and the eventual withdrawal from sharing news. Equilibrium points are determined and their stability is analyzed to characterize the system's long-term behaviour. Numerical simulations are used to examine the system under different parameter values and to observe how misinformation spreads or declines over time. This study offers a useful understanding of how misinformation propagates through social media via different user behaviours and helps in designing strategies to control and reduce its impact.

Keywords

Misinformation, authentic information, social media, mathematical model, compartmental model, dynamic system, numerical simulation, stability analysis.

Introduction

In today’s digital world, social media has become one of the primary sources of news and information. However, the quick spread of misinformation on digital platforms has become a major problem, affecting both users and the general quality of online content [1]. Misinformation can be described as false or inaccurate content distributed, often unintentionally, on widely used platforms such as Instagram, Facebook, Twitter, etc. The propagation of such information is influenced by a combination of psychological behaviour, social interaction and technological mechanisms. As a result, it can have negative impacts, such as loss of trust in institutions and increasing confusion and emotional discomfort among individuals [2]. Recent studies show that misinformation spreads faster and reaches a wider audience than verified information. This effect is even more pronounced during a crisis, where uncertainty, strong emotions and political pressure boost the rate and scale of its spread [3,4].

A telling case for analyzing misinformation dynamics in the context of environmental governance is the Aravalli crisis in India. Public discussions about changes in the region's legal protection spread rapidly across social media platforms, where news was often twisted or presented out of context. During this time, false claims, exaggerated narratives and partial representations of policy decisions circulated widely among users. In many cases, official communications were overshadowed by unverified content, leading to public uncertainty and division. This pattern has been observed in various environmental and crisis-related scenarios, where misinformation spreading quickly through digital platforms adds uncertainty to decision-making processes [5,6].

Earlier research indicates that the propagation of misinformation on social media is influenced by social, psychological and technological factors. Platform features such as one-click sharing and large network connectivity allow information, regardless of factual accuracy, to spread rapidly among users. Additionally, algorithmic systems frequently emphasize content that provokes a strong emotional reaction, thereby amplifying sensational and inaccurate narratives [7]. From a behavioural perspective, individuals are prone to accepting and sharing misinformation, particularly when cognitive biases, low digital literacy and confirmation bias come into play, especially on controversial issues such as environmental activism and conservation [7,8].

Misinformation related to environmental issues creates critical challenges because it can distort ecological risks, erode public confidence in scientific results and interfere with knowledge-based decisions. Research on climate communication and environmental misinformation shows that misleading narratives can weaken shared understanding, widen ideological divides and postpone collective environmental action [9,10]. Misinformation surrounding protected areas can also raise societal tensions and influence contentious situations such as protests or legal conflicts [5].

According to research in mathematical and computer science, the spread of misinformation closely parallels epidemic processes, in which stages such as exposure, acceptance and retransmission occur within structured social networks [11,12]. These modelling tools offer useful insight into how misinformation spreads, persists and eventually reaches a stable state within online communities. This perspective is particularly effective for investigating the spread of false narratives during events such as the Aravalli-related debate.

As social media increasingly plays a dominant role as a key source of information, it is essential to understand how misinformation spreads, especially during an environmental crisis. The purpose of this research is to gain a better understanding of how narratives emerge, spread and influence public perception in the domain of environmental governance. The study aims to present insights by framing the Aravalli crisis within the theoretical, practical and modelling approaches of misinformation research. These findings can help in designing effective countermeasures, improving digital awareness and supporting research-based communication techniques in future environmental crises.

LITERATURE REVIEW:

The detection of fake news has become a significant area of research in recent years. Most studies in this field focus on examining both textual features and visual elements present in news articles. The spread of misinformation can influence public opinion and policy perception and can impact mental health (Arora et al., 2025). Research by Denniss et al. (2025) suggests that emotionally engaging narratives significantly increase the probability of misinformation being shared within online groups. Misinformation, as described by Himma-Kadakas et al. (2022), refers to the dissemination of false or inaccurate information, often unintentionally, without the intent to mislead or deceive the audience (Greeshma et al., 2024). A recent study by Raza et al. (2022) introduced a transformer-based model for detecting fake news, employing an encoder for learning and a decoder for prediction. For example, Aspnes et al. (2006) considered a setting with a single random source, no target, and no threshold on the number of infected nodes, where the objective is to minimize the sum of the number of monitors and the number of infected nodes (Amoruso et al., 2020). Vosoughi et al. (2018) discovered that fake news spreads faster, deeper, and more extensively than true news on social media platforms. Ahmad et al. (2025) examined the motivation behind misinformation sharing using affordance theory and flow theory. Their research found that social media design characteristics such as immediate sharing, social validation systems, and immersive user experiences encourage users to share content quickly without confirming its authenticity. Many mechanisms animate the flow of false information, generating false beliefs that, once adopted, are rarely corrected (Del Vicario et al., 2016). Recent research has advanced the study of disinformation by using computational and mathematical modelling tools to better understand its dissemination dynamics. According to Chen et al. (2023), misinformation propagation on social media follows epidemic-like dynamics in which exposure, adoption and retransmission occur within interconnected networks. Previous work by Yaqub et al. (2020) has shown that people lack trust in automated solutions for fake news detection; however, work is already being undertaken to increase this trust, for instance by von der Weth et al. (2020).

FORMULATION AND DESCRIPTION OF MODEL:

To study the dynamics of misinformation spread through social media platforms, the total population is categorized into six distinct compartments according to individuals’ interaction with online content. The Susceptible (S) group represents users who have not yet encountered the information but are likely to be exposed through social media platforms. After encountering information, individuals enter the Exposed (E) category, where they have seen the content but have not yet checked its authenticity. From this point, some users may accept the information as true and begin to share it, thus entering the Fake news (F) compartment, which significantly contributes to the spread of misinformation. Meanwhile, other users may remain unsure and are classified into the Doubtful news (D) compartment, representing hesitation or partial belief. Conversely, users who examine and verify the content against trusted sources before sharing move into the Authentic news (A) compartment, which supports the spread of factual information and limits misleading narratives. Lastly, the Restrained (R) class includes users who stop sharing content because of awareness, fact-checking efforts, platform policies or external controls. This compartmental approach provides a clear and systematic mathematical framework for understanding the spread and control of misinformation within social media platforms.
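The transitions described above can be sketched as a system of linear ODEs. The symbols below (Λ for recruitment, β for exposure, α₁–α₃ for the exposure-to-F/A/D rates, γ₁, γ₂ for the doubtful-to-F/A rates, δ₁, δ₂ for the restraining rates, and μ for natural exit) are our own labels for the Table 1 parameters, since the paper's notation is not reproduced here; the structure is one plausible reading of the verbal description, not necessarily the authors' exact system:

```latex
\begin{aligned}
\frac{dS}{dt} &= \Lambda - \beta S - \mu S,\\
\frac{dE}{dt} &= \beta S - (\alpha_1 + \alpha_2 + \alpha_3 + \mu)\,E,\\
\frac{dF}{dt} &= \alpha_1 E + \gamma_1 D - (\delta_1 + \mu)\,F,\\
\frac{dA}{dt} &= \alpha_2 E + \gamma_2 D - (\delta_2 + \mu)\,A,\\
\frac{dD}{dt} &= \alpha_3 E - (\gamma_1 + \gamma_2 + \mu)\,D,\\
\frac{dR}{dt} &= \delta_1 F + \delta_2 A - \mu R.
\end{aligned}
```

Each equation balances the inflows and outflows of one compartment; every term that leaves one compartment reappears as an inflow to another, except the natural-exit terms μX.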

Table 1. Description of model parameters:

Parameter   Description                                               Value   Source
—           Recruitment rate                                          0.18    Assumed
—           Exposure transmission rate                                0.81    Assumed
—           Transmission rate of exposure to fake information         0.75    Assumed
—           Transmission rate of exposure to authentic information    0.45    Assumed
—           Transmission rate of exposure to doubtful information     0.34    Assumed
—           Transition rate from doubtful to fake information         0.23    Assumed
—           Transition rate from doubtful to authentic information    0.456   Assumed
—           Restraining rate of fake information                      0.36    Assumed
—           Restraining rate of authentic information                 0.86    Assumed
—           Natural exit rate                                         0.28    Assumed

Model diagram:

Figure 1. Diagram of the spread of misinformation through social media

Figure 1 depicts the model diagram, which corresponds to a dynamic system of differential equations describing the rates of change over time.

The total population at time t is classified into six distinct compartments based on the spread of misinformation.

Existence of Equilibrium Points:

The equilibrium points are determined by solving system (1) stated above:

Endemic equilibrium point E* = (S*, E*, F*, A*, D*, R*)                 (2)
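For a linear compartmental system, the endemic equilibrium can be obtained by setting the right-hand sides to zero and solving the resulting linear system. A minimal sketch, assuming the transition structure described in the formulation section and the Table 1 values (the parameter symbols are ours, not the paper's):

```python
import numpy as np

# Assumed parameter values from Table 1 (symbols are ours, hypothetical)
Lam, beta = 0.18, 0.81               # recruitment, exposure
a1, a2, a3 = 0.75, 0.45, 0.34        # E -> F, E -> A, E -> D
g1, g2 = 0.23, 0.456                 # D -> F, D -> A
d1, d2 = 0.36, 0.86                  # F -> R, A -> R
mu = 0.28                            # natural exit rate

# Coefficient matrix M of dx/dt = M x + b, state x = (S, E, F, A, D, R)
M = np.array([
    [-(beta + mu), 0, 0, 0, 0, 0],
    [beta, -(a1 + a2 + a3 + mu), 0, 0, 0, 0],
    [0, a1, -(d1 + mu), 0, g1, 0],
    [0, a2, 0, -(d2 + mu), g2, 0],
    [0, a3, 0, 0, -(g1 + g2 + mu), 0],
    [0, 0, d1, d2, 0, -mu],
])
b = np.array([Lam, 0, 0, 0, 0, 0])   # constant recruitment into S

# Equilibrium: 0 = M x* + b  =>  x* = -M^{-1} b
x_star = np.linalg.solve(M, -b)
S_s, E_s, F_s, A_s, D_s, R_s = x_star
```

With these values every component of x* is positive, consistent with an endemic equilibrium in which all six compartments remain occupied.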

Stability Analysis:

In this section, we will discuss local stability and global stability for the equilibrium point.

Local stability

Theorem 4.1 (Local stability of E*)

The endemic equilibrium point is locally asymptotically stable if

Proof. Evaluating the Jacobian matrix for the system at the point E*

Global stability

Since the model is a linear dynamic system of ordinary differential equations, local asymptotic stability implies global asymptotic stability.

The equilibrium point of the system is globally asymptotically stable if all eigenvalues of the Jacobian matrix J have negative real parts (Hirsch et al., 2013; Khalil, 2002).
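The eigenvalue criterion can be checked numerically. A sketch under the same assumed coefficient matrix as in the equilibrium computation (our notation, built from the Table 1 values; not the paper's exact Jacobian):

```python
import numpy as np

# Assumed coefficient matrix of the linear system dx/dt = M x + b,
# state order (S, E, F, A, D, R), entries built from the Table 1 values
M = np.array([
    [-1.09, 0, 0, 0, 0, 0],          # -(beta + mu)
    [0.81, -1.82, 0, 0, 0, 0],       # beta, -(a1 + a2 + a3 + mu)
    [0, 0.75, -0.64, 0, 0.23, 0],    # a1, -(d1 + mu), g1
    [0, 0.45, 0, -1.14, 0.456, 0],   # a2, -(d2 + mu), g2
    [0, 0.34, 0, 0, -0.966, 0],      # a3, -(g1 + g2 + mu)
    [0, 0, 0.36, 0.86, 0, -0.28],    # d1, d2, -mu
])

eigvals = np.linalg.eigvals(M)
stable = bool(np.all(eigvals.real < 0))
```

Because this matrix becomes lower triangular under a reordering of the compartments, its eigenvalues are simply the (all negative) diagonal entries, so the equilibrium of the assumed system is globally asymptotically stable.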

Numerical Simulation:

In this section, we simulate the dynamics of the spread of misinformation using the parameter values given in Table 1. The simulation is carried out and interpreted using MATLAB.
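The paper's simulations use MATLAB; an equivalent open sketch in Python integrates the assumed linear system (our notation, Table 1 values) with a classical RK4 scheme. The initial condition below is an illustrative guess, not taken from the paper:

```python
import numpy as np

# Assumed parameters (Table 1; symbols are ours)
Lam, beta, mu = 0.18, 0.81, 0.28
a1, a2, a3 = 0.75, 0.45, 0.34
g1, g2, d1, d2 = 0.23, 0.456, 0.36, 0.86

def rhs(x):
    """Right-hand side of the assumed linear ODE system."""
    S, E, F, A, D, R = x
    return np.array([
        Lam - (beta + mu) * S,
        beta * S - (a1 + a2 + a3 + mu) * E,
        a1 * E + g1 * D - (d1 + mu) * F,
        a2 * E + g2 * D - (d2 + mu) * A,
        a3 * E - (g1 + g2 + mu) * D,
        d1 * F + d2 * A - mu * R,
    ])

def rk4(x0, t_end=50.0, dt=0.01):
    """Integrate with a fixed-step classical Runge-Kutta scheme."""
    x, traj = np.array(x0, float), []
    for _ in range(int(t_end / dt)):
        k1 = rhs(x)
        k2 = rhs(x + 0.5 * dt * k1)
        k3 = rhs(x + 0.5 * dt * k2)
        k4 = rhs(x + dt * k3)
        x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(x.copy())
    return np.array(traj)

# Illustrative initial state: mostly susceptible users, a few exposed
traj = rk4([0.9, 0.1, 0.0, 0.0, 0.0, 0.0])
final = traj[-1]
```

Because the assumed system is linear and stable, the final state approximates the endemic equilibrium regardless of the chosen initial condition, mirroring the convergence reported in the figures below.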

Figure 2: Dynamics of misinformation spread in the model

Figure 2 shows the dynamics of misinformation spread through social media during the Aravalli case, plotting the population of each compartment over time in hours. Initially, the susceptible and exposed populations declined rapidly as people became aware of and interacted with Aravalli-related content on social media. The results show that misinformation spreads rapidly when unverified claims about the Aravalli region are widely shared on social media. However, as the availability of real information from scientific sources and government declarations increases, the propagation of false information reduces. Users in the doubtful category play a crucial role, since prompt clarification can prevent them from spreading misleading information.

Figure 3. Phase portrait of Exposed individuals (E) and Authentic Spreaders (A)

Figure 3 represents the phase portrait of authentic news spreaders and exposed individuals in the misinformation spread model during the Aravalli case. The horizontal axis indicates the exposed individuals, and the vertical axis indicates the authentic news spreaders. The arrows indicate the direction of change of the system. The blue curves represent the trajectories of the model over time. All trajectories move toward a common equilibrium point, indicating that the system gradually achieves stability. This behaviour shows that the interaction between exposed individuals and authentic information eventually stabilizes and helps in the reduction of the spread of misinformation. This suggests that as more exposed individuals receive authentic information, the spread of misinformation decreases and the system gradually approaches stability.

Figure 4. Phase portrait of Exposed individuals (E) and Fake Spreaders (F)

Figure 4 represents the phase portrait of fake news spreaders and exposed individuals in the misinformation spread model during the Aravalli case. The horizontal axis indicates the exposed individuals, and the vertical axis indicates the fake spreaders. The arrows indicate the direction of change of the system. The blue curves represent the trajectories of the model over time. The trajectories gradually move toward a stable equilibrium point, indicating that the interaction between exposed individuals and fake information spreaders eventually stabilizes. This suggests that the spread of misinformation may decrease as the system approaches equilibrium.

Figure 5. Phase portrait of Exposed individuals (E) and Doubtful Spreaders (D)

Figure 5 represents the phase portrait of doubtful news spreaders and exposed individuals in the misinformation spread model during the Aravalli case. The horizontal axis indicates the exposed individuals, and the vertical axis indicates the doubtful spreaders. The arrows indicate the direction of change of the system. The blue curves represent the trajectories of the model over time. The trajectories gradually move toward a stable equilibrium point, indicating that the interaction between exposed individuals and doubtful information spreaders eventually stabilizes. This suggests that as more individuals become doubtful about the information, the spread of misinformation tends to slow down and the system approaches stability.

Figure 6. Phase portrait of Doubtful Spreaders (D) and Authentic Spreaders (A)

Figure 6 represents the phase portrait of doubtful news spreaders and authentic news spreaders in the misinformation spread model during the Aravalli case. The horizontal axis indicates the doubtful users, and the vertical axis indicates the authentic information spreaders. The arrows indicate the direction of change of the system. The blue curves represent the trajectories of the model over time. The trajectories gradually move toward a stable equilibrium point, indicating that the interaction between doubtful users and authentic individuals eventually stabilizes. This suggests that as authentic information increases and more individuals become doubtful of misinformation, the spread of false information tends to decrease and the system approaches stability.

Figure 7. Phase portrait of Doubtful Spreaders (D) and Fake Spreaders (F)

Figure 7 represents the phase portrait of fake news spreaders and doubtful individuals in the misinformation spread model during the Aravalli case. The horizontal axis indicates the doubtful users, and the vertical axis indicates the fake information spreaders. The arrows indicate the direction of change of the system. The blue curves represent the trajectories of the model over time. The trajectories gradually move toward a stable equilibrium point, indicating that the interaction between doubtful users and fake spreaders gradually stabilizes. This leads to a reduction in the spread of fake information over time.

Figure 8. Phase portrait of Authentic Spreaders (A) and Restrained individuals (R)

Figure 8 represents the phase portrait of authentic news spreaders and restrained individuals in the misinformation spread model during the Aravalli case. The horizontal axis indicates the authentic spreaders, and the vertical axis indicates the restrained individuals. The arrows indicate the direction of system dynamics. The blue curves represent the trajectories of the model over time. The trajectories gradually move toward a stable equilibrium point, indicating that the interaction between authentic spreaders and restrained individuals settles over time. This suggests that as authentic information spreads and more individuals become restrained from sharing content, the spread of misinformation decreases and the system gradually approaches stability.

Figure 9. Phase portrait of Fake Spreaders (F) and Restrained individuals (R)

Figure 9 represents the phase portrait of fake news spreaders and restrained individuals in the misinformation spread model during the Aravalli case. The horizontal axis indicates the fake spreaders, and the vertical axis indicates the restrained individuals. The arrows indicate the direction of system dynamics. Blue trajectories represent the solution paths under various initial conditions. All trajectories move towards the stable equilibrium point, which indicates that misinformation may initially spread through fake spreaders but eventually declines as more people become restrained and cease spreading misinformation.
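The convergence shown across these phase portraits can be reproduced numerically: trajectories started from different initial conditions collapse onto the same equilibrium. A sketch under the same assumed linear system used above (our notation, Table 1 values; both starting points are illustrative, not taken from the paper):

```python
import numpy as np

# Assumed parameters (Table 1; symbols are ours)
Lam, beta, mu = 0.18, 0.81, 0.28
a1, a2, a3, g1, g2, d1, d2 = 0.75, 0.45, 0.34, 0.23, 0.456, 0.36, 0.86

def rhs(x):
    """Right-hand side of the assumed linear ODE system."""
    S, E, F, A, D, R = x
    return np.array([
        Lam - (beta + mu) * S,
        beta * S - (a1 + a2 + a3 + mu) * E,
        a1 * E + g1 * D - (d1 + mu) * F,
        a2 * E + g2 * D - (d2 + mu) * A,
        a3 * E - (g1 + g2 + mu) * D,
        d1 * F + d2 * A - mu * R,
    ])

def integrate(x0, t_end=60.0, dt=0.01):
    """Forward Euler; adequate here since dt * |lambda_max| << 1."""
    x = np.array(x0, float)
    for _ in range(int(t_end / dt)):
        x = x + dt * rhs(x)
    return x

# Two illustrative starting points in the (S, E, F, A, D, R) state space
end_a = integrate([1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
end_b = integrate([0.2, 0.5, 0.3, 0.1, 0.1, 0.0])
gap = float(np.linalg.norm(end_a - end_b))
```

Since the slowest eigenvalue of the assumed system is about -0.28, the distance between the two trajectories shrinks roughly like e^(-0.28 t) and is negligible by t = 60, which is the "all trajectories reach a common point" behaviour the phase portraits depict.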

CONCLUSION

This study presents a linear compartmental model to analyze the spread of misinformation through social media during the Aravalli crisis. The model divides the population into six compartments: Susceptible (S), Exposed (E), Fake news (F), Authentic news (A), Doubtful news (D) and Restrained individuals (R), to describe the dynamics of sharing and spreading information. The stability analysis shows that the equilibrium point is globally asymptotically stable, indicating that the spread of misinformation eventually decreases over time under the given conditions. The numerical simulation demonstrates that during the Aravalli case, misinformation spread rapidly in the initial stage due to fake spreaders on social media. However, as time progresses, the number of restrained individuals increases, which helps reduce the spread of false information. Overall, the results indicate the importance of awareness, fact-checking and responsible information sharing in controlling the spread of misinformation on social media platforms.

REFERENCES

  1. Almaliki, Malik (2019, April). Online misinformation spread: A systematic literature map. Proceedings of the 2019 3rd International Conference on Information Systems and Data Mining 171-178.
  2. Alalawi, S., Baalfaqih, S., Almeqbaali, M., & Masud, M. M. (2023, November). Social Media Misinformation Propagation and Detection. Proceedings of the 2023 15th International Conference on Innovations in Information Technology (IIT) 240–245.
  3. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151.
  4. Wang, Y., McKee, M., Torbica, A., & Stuckler, D. (2019, November). Systematic literature review on the spread of health-related misinformation on social media. Social Science & Medicine, 240(112552).
  5. Islam, A. N., Laato, S., Talukder, S., & Sutinen, E. (2020). Misinformation sharing and social media fatigue during COVID-19: An affordance and cognitive load perspective. Technological forecasting and social change, 159(120201).
  6. Cinelli, M., Quattrociocchi, W., Galeazzi, A., Valensise, C., Brugnoli, E., Schmidt, A. L., Zola, P., Zollo, F., & Scala, A. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9), e2023301118.
  7. Ahmad, F. N., Ibrahim, N. Z. M., & Hamid, A. S. A. (2025, May). The impact of misinformation on social media. International Journal of Communication and Media Studies, 1(1), 9–12.
  8. Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50.
  9. Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the post-truth era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.
  10. Essien, E. O. (2025). Climate change disinformation on social media: A meta-synthesis on epistemic welfare in the post-truth era. Social Sciences, 14(5), 304.
  11. Daley, D. J., & Kendall, D. G. (1965). Stochastic rumours. Journal of the Institute of Mathematics and Its Applications, 1(1), 42–55.
  12. Kumar, S., Saini, M., Goel, M., & Panda, B. S. (2021). Modeling information diffusion in online social networks using a modified forest-fire model. Journal of Intelligent Information Systems, 56(2), 355–377.
  13. Arora, S., Arora, S., Kumar, D., Agrawal, V., Gupta, V., & Vasdev, D. (2025). Examining the mental health impact of misinformation on social media using a hybrid transformer-based approach. arXiv.
  14. Denniss, E., & Lindberg, R. (2025). Social media and the spread of misinformation: infectious and a threat to public health. Health Promotion International, 40(2), daaf023.
  15. Greeshma, R., Biradar, D., Bharadwaj, G. R., Goutham, N., Girijamma, H. A., & Kallas, S. P. (2024). Social media misinformation. International Journal for Multidisciplinary Research (IJFMR), E-ISSN 2582-2160.
  16. Raza, S., & Ding, C. (2022). Fake news detection based on news content and social contexts: A transformer-based approach. International Journal of Data Science and Analytics, 13(4), 335–362.
  17. Amoruso, M., Anello, D., Auletta, V., Cerulli, R., Ferraioli, D., & Raiconi, A. (2020). Contrasting the spread of misinformation in online social networks. Journal of Artificial Intelligence Research, 69, 847–879.
  18. Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554–559.
  19. Chen, S., Xiao, L., & Kumar, A. (2023). Spread of misinformation on social media: What contributes to it and how to combat it. Computers in Human Behaviour, 141, 107643. 
  20. Yaqub, W., Kakhidze, O., Brockman, M. L., Memon, N., & Patil, S. (2020). Effects of credibility indicators on social media news sharing intent. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–14.
  21. von der Weth, C., Jannach, D., & Müller, K. (2020). Machine learning explanations to prevent overtrust in fake news detection. arXiv.
  22. Aimeur, E., Amri, S., & Brassard, G. (2023). Fake news, disinformation and misinformation in social media: a review. Social Network Analysis and Mining, 13(1), 30.
  23. Hirsch, M. W., Smale, S., & Devaney, R. L. (2013). Differential equations, dynamical systems, and an introduction to chaos. Academic Press (60).
  24. Khalil, H. K., & Grizzle, J. W. (2002). Nonlinear systems (3rd ed.). Upper Saddle River, NJ: Prentice Hall.


Nita H Shah (Corresponding author)
School of Emerging Science & Technology, Gujarat University, Ahmedabad

Thacker Hardi (Co-author)
School of Emerging Science & Technology, Gujarat University, Ahmedabad

Parmar Yashvi (Co-author)
School of Emerging Science & Technology, Gujarat University, Ahmedabad

Jalpa Vaghela (Co-author)
Department of Applied Sciences and Humanities, Parul Institute of Technology, Parul University, Vadodara, Gujarat

Thacker Hardi1, Parmar Yashvi1, Jalpa Vaghela2, Nita H Shah*1, Spread of Misinformation through Social Media during the Aravalli Crisis, Int. J. Sci. R. Tech., 2026, 3 (4), 1159-1167. https://doi.org/10.5281/zenodo.19903521
