Science, asked by kkfffff6, 7 months ago

Do the roots of a plant always grow underground?

Answers

Answered by beabinu200742
It is essential for roots to grow downward so they can explore the soil and maximise their water uptake. Scientists have long attributed this bending in response to gravity to the redistribution of the plant hormone auxin in the tip of the root.


Answered by pranjalsingh5

Answer:

Yes, plant roots grow in the soil to take in the water and nutrients that the plant needs to make food for itself.
