Sunday, April 26, 2009

Role of Adenosine Transport Inhibition in the Antinociceptive and Antidepressant Effects of Tricyclic Antidepressants

These are some articles suggesting that tricyclics may produce some of their antinociceptive or antidepressant effects by inhibiting adenosine transport, an effect that would tend to increase extracellular adenosine and thereby increase plasma-membrane adenosine receptor activation. These articles can get kind of crazy, and they don't show that the antidepressants are acting only by adenosinergic mechanisms or anything. One could conceivably show that the effects of many drugs could be partially blocked by one or another adenosine receptor antagonist, because adenosine receptors affect most other neurotransmitter systems and countless other processes. For example, a tricyclic could bind to some "non-adenosine-binding" receptor or block some transporter that doesn't transport purines, and that action could indirectly interfere with adenosine-receptor activation or adenosine-receptor-mediated signal transduction events. Also, the acute effect on adenosine receptor subtypes (a predominance of A1 adenosine receptor activation, for example) might not be the same as the effect in the long term (the A1 adenosine receptor density or sensitivity, for example, might decrease).

The tricyclics have many mechanisms, other than serotonin and noradrenaline reuptake inhibition and adenosine transport inhibition, that are thought to contribute to their antidepressant or antinociceptive (neuropathic pain-relieving) effects. For example, the anticholinergic effects create problems but nonetheless may contribute to their antidepressant effects. And the tricyclics are thought to relieve neuropathic pain, in part, by blocking sodium channels on neurons (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=tricyclic+sodium+channel), and in that regard they resemble local anesthetics (lidocaine, procaine, etc., which exert their anesthetic effects by blocking sodium channels). They also may relieve neuropathic pain partly by inhibiting noradrenaline and serotonin reuptake and thereby increasing the descending inhibitory influence that neurons in the locus ceruleus and locus subceruleus and raphe nuclei exert on ascending nociceptive transmission in the dorsal horn of the spinal cord. But those mechanisms appear to be less important than the sodium-channel-blocking effects. The tricyclics can also exert an indirect sodium-channel-blocking effect on some cell types, secondary to their anticholinergic effects (muscarinic acetylcholine receptor antagonism). And some of those pain articles discuss the complex interactions of the tricyclics' adenosinergic and catecholaminergic effects in animal models of hyperalgesia. But these articles showing adenosinergic effects of tricyclics are nonetheless interesting.

The apparent inhibitory effect of some tricyclics on adenosine transport seems to be one of their mechanisms of action that could plausibly occur in vivo [Phillis, 1984: (http://www.pubmedcentral.nih.gov/picrender.fcgi?artid=1987110&blobtype=pdf)(http://www.ncbi.nlm.nih.gov/pubmed/6487906); Sawynok et al., 2005: (http://www.ncbi.nlm.nih.gov/pubmed/16156010)]. Phillis (1984) discusses some of the lines of pharmacological evidence suggesting that the concentrations of the tricyclics in the interstitial fluid in the CNS might be high enough, in response to commonly-used dosages, to produce meaningful inhibition of adenosine transport (adenosine "reuptake"). Phillis (1984), for example, found that some tricyclic antidepressants inhibited adenosine transport and potentiated the adenosine-induced decreases in the firing rates of neurons from the cerebral cortices of rats.

There are lots of other articles showing these types of effects of tricyclics (a lot of these articles test amitriptyline) in animal models of depression or hyperalgesia [Esser and Sawynok, 2000: (http://www.ncbi.nlm.nih.gov/pubmed/10884512); Sawynok et al., 2008: (http://www.ncbi.nlm.nih.gov/pubmed/18562097); Esser et al., 2001: (http://www.ncbi.nlm.nih.gov/pubmed/11711033); Sawynok et al., 1999: (http://www.ncbi.nlm.nih.gov/pubmed/10204717); Skrabanja et al., 2005: (http://bja.oxfordjournals.org/cgi/content/full/94/5/556)(http://www.ncbi.nlm.nih.gov/pubmed/15722385?dopt=Abstract); Waldron et al., 2004: (http://www.ncbi.nlm.nih.gov/pubmed/15381049); Golembiowska and Dziubina, 2001: (http://www.ncbi.nlm.nih.gov/pubmed/11414661); Enriquez-Castillo et al., 2008: (http://www.ncbi.nlm.nih.gov/pubmed/18436269)].

Tuesday, April 21, 2009

Effects of Free Fatty Acids on ACTH and CRH Release and Responsiveness, on Sympathetic Activity, and, Potentially, on Cerebral Blood Flow

This article [Migrenne et al., 2006: (http://diabetes.diabetesjournals.org/cgi/content/full/55/Supplement_2/S139)(http://cat.inist.fr/?aModele=afficheN&cpsidt=18366946)] (why is the journal Diabetes not indexed in Pubmed?) is really interesting, and the authors discuss research showing that the infusion of oleate or other free fatty acids (FFAs) into the carotid artery can cause those FFAs to enter hypothalamic neurons and either augment or decrease insulin release, in either a plasma-glucose-dependent or a glucose-independent manner, by altering the sympathetic outflow from the brain to the pancreas and other sites. The beta-oxidation of FFAs, in the hypothalamus and other parts of the brain, is required for many of these effects to occur, as discussed by Migrenne et al. (2006). This is relevant to the possibility that elevations in some saturated FFAs may produce mood-elevating or mild anticonvulsant effects, even as they may contribute to insulin resistance and other undesirable conditions, etc. [(http://hardcorephysiologyfun.blogspot.com/2009/04/protection-against-ischemic-damage-by.html)(http://hardcorephysiologyfun.blogspot.com/2009/04/low-cholesterol-levels-and-risk-of.html)]. Kok et al. (2004) [Kok et al., 2004: (http://ajpendo.physiology.org/cgi/content/full/287/5/E848)(http://www.ncbi.nlm.nih.gov/pubmed/15280154?dopt=Abstract)] cite research (reference 63) showing that high-fat diets tend to increase FFA levels, and this is fairly well-known to be the case, in my opinion. Kok et al. (2004) found that the acipimox-induced decreases in FFA levels had reduced ACTH levels in obese people. Although the authors of many articles present research to show that FFAs produce sympathetic activation or increase ACTH release, Lanfranco et al. (2004) [Lanfranco et al., 2004: (http://jcem.endojournals.org/cgi/content/full/89/3/1385)(http://www.ncbi.nlm.nih.gov/pubmed/15001638?dopt=Abstract)] found that an acute increase in plasma FFA levels reduced both cortisol and ACTH, and the authors discussed evidence suggesting the FFAs had exerted their inhibitory influence on ACTH secretion by acting on the hypothalamus (i.e. acting on neurons or astrocytes). There's some evidence that FFAs can increase sympathetic activation by acting on plasma membrane ion channels, and unsaturated FFAs can inhibit or otherwise affect beta-adrenoreceptor activation. FFAs can modify ligand binding to a number of different G-protein coupled receptors. Many of the effects on the sympathetic outflow from the CNS appear to be the result of the beta-oxidation of FFAs in the hypothalamus, though, presumably in astrocytes. Tataranni et al. (1999) [Tataranni et al., 1999: (http://www.pnas.org/cgi/content/full/96/8/4569)(http://www.ncbi.nlm.nih.gov/pubmed/10200303)] found that the elevation in plasma FFA levels after a meal correlated positively with regional cerebral blood flow (rCBF) to the dorsolateral prefrontal cortex, in association with an increase in satiety after the meal. That's a significant finding, and it could partly be a result of the beta-oxidation of those FFAs in cerebral vascular endothelial cells. The dorsolateral prefrontal cortex is obviously a site whose neuronal activity is thought to be crucially important in cognitive functioning and mood regulation, etc. [(http://scholar.google.com/scholar?q=%22dorsolateral+prefrontal+cortex%22+%22cerebral+blood+flow%22+depression&hl=en&lr=); (http://scholar.google.com/scholar?hl=en&lr=&q=%22dorsolateral+prefrontal+cortex%22+%22cerebral+blood+flow%22+cognitive)]. 
The recovery from depression was associated with increases in rCBF to the dorsolateral prefrontal cortex, for example [Bench et al., 1995: (http://www.ncbi.nlm.nih.gov/pubmed/7675913)]. It should be noted that something like resistance exercise tends to elevate FFAs as well as ACTH and cortisol levels. The elevations in cortisol levels following resistance exercise can be very significant, and they're not really "bad," in my opinion. One can view that type of elevation as being a "strong signal" that essentially overrides glucocorticoid resistance at the level of the CNS or even in cells outside the brain. In asthma, for example, responsiveness to beta-adrenoreceptor agonists, which can produce anti-inflammatory effects on many cell types, can be restored within 24 hours by glucocorticoid administration. In chronic stress and depression, the issue tends not to be elevations in cortisol per se but resistance to the feedback inhibition of ACTH release by the pituitary and of CRH release from the hypothalamus. CRH generally activates noradrenergic neurons in the locus ceruleus, and an acute increase in noradrenaline availability in the hypothalamus can decrease CRH release from hypothalamic neurons [Hillhouse et al., 1975: (http://www.ncbi.nlm.nih.gov/pubmed/1079076); Valentino et al., 1988: (http://www.jneurosci.org/cgi/reprint/8/3/1016.pdf)(http://www.ncbi.nlm.nih.gov/pubmed/3258021)]. Those types of regulatory mechanisms don't work very effectively, even under the best of circumstances, and the regulation of CRH release is very complex.

Monday, April 20, 2009

Increase in Nucleotide Absorption and Retention During the Fasted State: Bioavailability Issues and Potential Problems With Enteric-Coated Tablets

These articles [Gross et al., 1988: (http://www.ncbi.nlm.nih.gov/pubmed/3390463); Gross and Savaiano, 1991: (http://www.ncbi.nlm.nih.gov/pubmed/2009279)] are really good and show that the retention, by the intestinal tissues themselves, of intrajejunally-administered nucleosides or nucleic acids can be roughly twice as great in the fasted state as in the "fed" state. Those articles are potentially confusing, because, normally, the retention of nucleosides or nucleotides in the intestines (i.e. the salvage of nucleosides and incorporation into nucleotide pools in the epithelial or smooth muscle cells, etc.) would be less than desirable, from the standpoint of bioavailability. But what those authors are saying is that xanthine oxidase activity is lower in the fasted state. Those articles tell me that administering purines, in particular, during the fasted state, as discussed below and in past postings [see Carver and Walker, 1995, cited and discussed here: (http://hardcorephysiologyfun.blogspot.com/2009/03/adenosine-and-guanosine-in-animal.html); see here, also: (http://hardcorephysiologyfun.blogspot.com/2009/01/details-on-nucleotides-bioavailability.html)], is likely to produce both greater bioavailability, as discussed in those past postings, and greater salvage of those purines by the target cells that the nucleotides or their metabolites enter. That's just my opinion. I didn't know the effect had been shown to be that large. Ho et al. (1979) [Ho et al., 1979: (http://jn.nutrition.org/cgi/reprint/109/8/1377.pdf)(http://www.ncbi.nlm.nih.gov/pubmed/458492)] found even larger increases in the absorption of nucleosides or nucleic acids in the fasted state, although those large, relative increases in tissue content, in the fasted state vs. the fed state, appear to be partially or largely due to increases in the absorption of the nucleotides or nucleic acids (purine bases).

It's important to note that the bioavailability is also likely, in my opinion, to be enhanced in the fasted state. The bioavailability is partly a function of how rapidly a physiological compound, such as a nucleotide, enters solution in the intraluminal fluid. I forget where the reference is, but the intraluminal fluid volume in the stomach or along a segment of the small intestine can be remarkably small and can be something like 15-20 mL. The general concept is that many physiological compounds (nucleotides in particular) can be transported into and metabolized by any cell they come in contact with. If a person takes something like guanosine or adenosine, as a free nucleoside, the low solubilities will, in my opinion, significantly limit the bioavailability of those nucleosides by slowing the rate of dissolution in the GI tract. The undissolved nucleosides will only slowly enter solution, as the fraction that is already dissolved is transported into cells or diffuses away passively. Savaiano et al. (1980) [Savaiano et al., 1980: (http://jn.nutrition.org/cgi/reprint/110/9/1793.pdf)(http://www.ncbi.nlm.nih.gov/pubmed/7411237)] found evidence suggestive of extreme differences in the bioavailabilities of nucleic acids and nucleosides, such that the intravenous (i.v.) administration produced levels of tissue retention that were 3-59 times the levels produced by oral administration. Usually, the ratio of i.v. to oral exposure for a drug, expressed as the ratio of the areas under the serum concentration vs. time curves [AUC(i.v.)/AUC(oral), the reciprocal of the oral bioavailability when the doses are equal], is maybe between 2 and 5 or 7 or something like that. Those differences found by Savaiano et al. (1980) are not especially relevant for human dosing, however, because the solubilities of both nucleic acids and nucleosides are drastically lower than the solubilities of the disodium salts of guanosine monophosphate (GMP) and adenosine monophosphate (AMP) or triphosphate (ATP), for example [or the disodium salt of inosine monophosphate (IMP)]. Those solubility differences could essentially mean that most of the nucleic acids or nucleosides would be degraded to uric acid, in humans, or to uric acid and then allantoin, in animals, before they could even enter the portal circulation, etc., in my opinion. Other salts of AMP or GMP or ATP also display deficient solubilities, and those solubility data are freely available on countless sites on the internet. Many of the researchers who have used oral guanosine or GMP as anticonvulsants, in animal experiments, have discussed those solubility issues. The authors of many of those older articles were evidently not aware of those issues, however, in my opinion, and they're very important issues.
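
To make the AUC comparison concrete, here is a minimal sketch, in Python, of how absolute oral bioavailability is usually estimated from dose-normalized AUCs. The numbers are hypothetical placeholders (they are not data from Savaiano et al., 1980); the point is just that an AUC(i.v.)/AUC(oral) ratio of 2-7 at equal doses corresponds to an oral bioavailability of roughly 15-50%, whereas a ratio of 59 corresponds to a bioavailability of under 2%.

```python
# Hypothetical illustration: absolute oral bioavailability from dose-normalized AUCs.
# F = (AUC_oral / Dose_oral) / (AUC_iv / Dose_iv); with equal doses, F is simply
# the reciprocal of the AUC(i.v.)/AUC(oral) ratio discussed above.

def oral_bioavailability(auc_oral, dose_oral, auc_iv, dose_iv):
    """Return the absolute oral bioavailability, F, as a fraction."""
    return (auc_oral / dose_oral) / (auc_iv / dose_iv)

for iv_to_oral_auc_ratio in (2.0, 7.0, 59.0):
    f = oral_bioavailability(auc_oral=1.0, dose_oral=100.0,
                             auc_iv=iv_to_oral_auc_ratio, dose_iv=100.0)
    print(f"AUC(i.v.)/AUC(oral) = {iv_to_oral_auc_ratio:>4.0f} -> F = {f:.1%}")
```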

Another major problem with oral purine dosing is the use of enteric coatings, and I've discussed this in detail previously (http://hardcorephysiologyfun.blogspot.com/2009/01/details-on-nucleotides-bioavailability.html). Most enteric coatings would be expected to severely and unnecessarily reduce the bioavailabilities of orally-administered nucleotides or nucleosides, in my opinion. Many enteric-coated tablets might never dissolve in the GI tract, in my opinion, because the pH in many people would not be expected to be high enough to allow the coatings to dissolve, as discussed by Fallingborg et al. (1999), cited below. Additionally, the use of tablets could be expected to produce the same, drastic slowing of entry into solution that a low level of solubility would be expected to produce, in my opinion. Persky et al. (2003) [Persky et al., 2003: (http://www.pharmacy.unc.edu/pkpd/AMP%20Articles/Persky%20et%20al%20Clin%20Pharmacok%202003.pdf)(http://www.ncbi.nlm.nih.gov/pubmed/12793840)] discussed the fact that the rate of dissolution of physiological substrates, such as creatine, can be an important pharmacokinetic variable to consider, and these types of pharmacokinetic considerations are potentially more important for maximizing the bioavailabilities of nucleotides or other physiological compounds, in my opinion, than those considerations are for maximizing the bioavailabilities of drugs. Many drugs cannot be extensively or even partially metabolized by every cell in the body. With physiological substrates, time (i.e. pharmacokinetics) is of the essence, so to speak, because the substrates are subject both to uptake by endothelial cells or cells in the liver and to degradation by every cell they become available to. Even in the case of creatine, Deldicque et al. (2008) [Deldicque et al., 2008: (http://www.ncbi.nlm.nih.gov/pubmed/17851680)], discussed here (http://hardcorephysiologyfun.blogspot.com/2009/03/adenosine-and-guanosine-in-animal.html), found that the Cmax of plasma creatine, a reflection of an "improvement" in the kinetics of absorption or entry into the systemic circulation, etc., was higher in response to the administration of creatine monohydrate in a water solution (i.e. pre-dissolved) than in response to its administration in foods, which slow the rate of entry of creatine monohydrate into solution. A lower Cmax could be expected, in my opinion, to decrease the fraction of nucleotides, for example, that would gain entry into the brain and be salvaged by cells in the brain, as opposed to being degraded into uric acid.
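
As a rough illustration of the Cmax point, here is a minimal one-compartment sketch, in Python, with first-order absorption and elimination (the Bateman equation). All of the parameter values (dose, volume of distribution, ka, ke) are hypothetical and are not taken from Persky et al. (2003) or Deldicque et al. (2008); the sketch only shows that slowing dissolution/absorption (a lower ka) lowers Cmax and delays Tmax even when the total amount absorbed is unchanged.

```python
# Minimal one-compartment oral absorption sketch (hypothetical parameters).
# C(t) = F*Dose*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))
import math

DOSE, V, F, KE = 5.0, 40.0, 1.0, 0.5   # assumed dose (g), volume (L), bioavailability, elimination rate (1/h)

def concentration(t, ka, ke=KE):
    """Plasma concentration (g/L) at time t (h) for first-order absorption rate ka (1/h)."""
    return (F * DOSE * ka) / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

def cmax_tmax(ka, ke=KE):
    tmax = math.log(ka / ke) / (ka - ke)   # analytical time of the peak for this model
    return concentration(tmax, ka), tmax

for label, ka in (("pre-dissolved, fasted (fast ka)", 3.0),
                  ("solid dose in a fed stomach (slow ka)", 0.3)):
    cmax, tmax = cmax_tmax(ka)
    print(f"{label:40s} Cmax = {cmax:.3f} g/L at Tmax = {tmax:.1f} h")
```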

Some of the confusion surrounding these issues may be the result of some lingering misconceptions that many people, even researchers, are evidently holding onto. The fasted state in a human means any time 12 or more hours after the previous meal, although I've seen the 10-hour time point used as a marker for the beginning of the "fasted" state. So this means the only true fasted state is likely to be the time period in the morning, before breakfast. Why is this the case? Fallingborg (1999) [Fallingborg et al., 1999: (http://www.ncbi.nlm.nih.gov/pubmed/10421978)] discussed large numbers of studies on the time course with which food moves out of the stomach and into the duodenum and jejunum and so on, and much of the data that Fallingborg (1999) discussed had been collected from experiments using different types of devices that transmit data on the pH and other variables in the gastrointestinal (GI) tract. Some of those are pill-sized devices that have tiny video cameras in them, but I'm not sure those were in use in 1999. After a person eats the first meal of the day, that food may, depending on the sizes and frequencies of the subsequent meals (meaning any food that is eaten), remain in the stomach for between ~2.6 and 14.5 hours (Fallingborg et al., 1999). The gastric residence time (GRT) of tiny, mechanized capsules, with pH sensors in them (the pH is a measure of acidity, such that pH values below or above 7 are "acidic" or "basic"/"alkaline," respectively) is between 1.1 and 1.9 hours in the fasted state, but the GRT for the same capsule can be *up to 14.5 hours* in a person who takes the capsule at breakfast and eats every couple of hours during the rest of the day. Fallingborg (1999) discusses the fact that a single, small meal, eaten in the morning, has been shown to only increase the GRT of the capsule to ~2.6 hours. Fallingborg (1999) discusses the fact that, in the fasted state, the interdigestive migrating myoelectric complex (IMMC), a cycle of contractile events in the smooth muscle that lines the stomach whose phase III consists of strong, sweeping contractions, allows solid food to exit the stomach about every 2 hours. When a person eats a single meal or, in particular, meals every 2-3 hours, the cyclic or "phasic" aspects of these contractions are abolished or "frozen," and food may not exit the stomach for many hours (up to 14 or 14.5). This is very important for understanding the major problems that exist, in my opinion, with enteric coatings for many preparations of (i.e. products containing) SAM-e or ATP disodium, etc., as discussed previously (http://hardcorephysiologyfun.blogspot.com/2009/01/details-on-nucleotides-bioavailability.html).

This residence time applies to solid substances that cannot enter solution and diffuse into the small intestine. If a person eats a water-soluble nutrient or sugar or amino acid or nucleotide, those substances can exit the stomach in aqueous (water) solution very rapidly. But even the slowing effect that is produced by the food in the stomach can limit the bioavailability of a water-soluble substance, such as creatine, in the "fed" state. So the stomach may not be completely empty until ~4 am or even later, if a person eats his or her last meal at 6 pm or something. The paper by Fallingborg (1999) is superb, and the author cites 183 papers and goes into exhaustive detail on all of these considerations.

A major point that Fallingborg (1999) makes is that statements about taking enteric-coated (acid-resistant) tablets "between meals" or "on an empty stomach, between meals" make no sense, because the stomach does not empty between meals. When a person is told to take some of these enteric-coated SAM-e or ATP disodium tablets (or other enteric-coated tablets) "between meals, on an empty stomach," the tablet may not exit the stomach and have any hope of releasing its contents until up to 14.5 hours after the person has taken it and eaten many subsequent meals. More importantly, Fallingborg (1999) discusses research showing that the mean pH in the duodenum of humans is ~6.22. The duodenal pH has been shown to range from 5.66 to 6.4 in other articles (Fallingborg, 1999). In the jejunum, the upper part of the small intestine proper, the pH is about 4.92 in the fasted state (a median value) and 6.08 after a meal. The pH is thought to increase to above 7, to 7.4-7.6, only in the distal ileum (Fallingborg, 1999), which is an almost shocking fact that helps to explain the many problems, such as intestinal strictures and so on, with enteric-coated tablets that have been reported in the literature. The pH in the proximal small intestine, therefore, ranges from ~6.08 to ~6.49 (Fallingborg, 1999), when one looks at the data from multiple articles. But many of the enteric coatings do not dissolve until the pH is some amount greater than 7, and yet the jejunum is the site at which enteric-coated tablets are supposed to dissolve. Enteric coatings are polymeric substances, generally, whose solubility is pH-dependent. That means they can't dissolve in fluids that display pH values below some critical range of numbers, and the lower limit of the range may be 7.5 or 8 or some other value and may depend on the particular formulation used by the manufacturer.
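
A rough way to see why those pH values matter: enteric coatings are weak polyacids, so the fraction of their carboxyl groups that is ionized (and hence their tendency to dissolve) can be sketched with the Henderson-Hasselbalch relationship. The pKa values in this sketch are hypothetical placeholders rather than measured values for any particular commercial polymer; the intraluminal pH values are the ones quoted from Fallingborg (1999) above.

```python
# Sketch: fraction of carboxyl groups ionized on a weak-polyacid enteric coating,
# via Henderson-Hasselbalch. Hypothetical pKa values; pH values from the text.

def fraction_ionized(ph, pka):
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

SITES = (("duodenum (mean)", 6.22), ("jejunum, fasted (median)", 4.92),
         ("jejunum, fed", 6.08), ("distal ileum", 7.5))

for pka in (6.0, 7.0):   # a coating "tuned" near pH 6 vs. one that needs pH > 7
    for site, ph in SITES:
        print(f"pKa {pka:.1f} | {site:26s} pH {ph:4.2f} -> {fraction_ionized(ph, pka):6.1%} ionized")
```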

Those data on the pH of the intraluminal fluid mean that the dissolution of enteric coatings could be very problematic, in my opinion. One explanation for the misconceptions about the pH in the intraluminal fluid might be that the pH of bile is ~8.03 (Fallingborg, 1999), and maybe people have thought that the pH of bile will be equivalent to the pH of the intraluminal fluid. It's just not the case. There can be a tendency to rely on 30- and 40-year-old data or research in some of these areas, and that tendency can become problematic, in my opinion. I should mention that, in many disease states, such as in people with liver disease, the jejunal pH can be substantially lower than those median or mean values measured in apparently healthy people, and the pH can decrease progressively throughout the day. The pH-sensitivity of something like an enteric coating should obviously, in my opinion, be engineered so as to allow dissolution at the lower range of intraluminal pH values for anyone. This would not be difficult to do, but few manufacturers or other people seem to be aware of the need for it (if enteric coatings are still going to be used). Here are some articles reporting gastric or intestinal injuries (i.e. obstruction of the pyloric sphincter or intestinal obstructions/strictures) from poorly-formulated enteric-coated tablets (this poor formulation extends to more or less all enteric-coated tablets, in my opinion, when one looks at the data on the pH-dependences of the polymers used in the coatings) [Harris, 1973: (http://www.ncbi.nlm.nih.gov/pubmed/4764749); Sogge et al., 1977: (http://www.ncbi.nlm.nih.gov/pubmed/22308); Davies, 1999: (http://www.ualberta.ca/~csps/JPPS2(1)/N.Davies/NSAID.htm)(http://www.ncbi.nlm.nih.gov/pubmed/10951657); Sherry, 1979: (http://www.ncbi.nlm.nih.gov/pubmed/287936); (http://scholar.google.com/scholar?num=50&hl=en&lr=&safe=off&q=%22enteric+coated%22+stricture+OR+obstruction)]. Obviously, non-enteric-coated aspirin could cause damage to the stomach or small intestine for other reasons, and one should always talk to one's doctor before making any change in any medication. The benefits of enteric-coated preparations may outweigh any potential problems with the preparations, for many people in many specific disease states. But my point is to show the many problems that exist with the approach, in a functional sense, and with many of the individual preparations, in my opinion.

When researchers refer to a "pyloric obstruction" from an enteric-coated aspirin tablet, the researchers mean that the tablet has become "stuck" in the valve-like muscle that opens, periodically, to allow food to pass from the stomach into the duodenum. In some cases, minor or not-so-minor surgical procedures are required to remove the obstructions caused by the undissolved tablets.

Sunday, April 19, 2009

Magnesium Status and DNA Damage and Repair; Vitamin D in Phosphate and Magnesium Homeostasis

This article [Hartwig, 2001: (http://www.curesnaturally.com/Articles/Science/3MgMutagenesis.pdf)
(http://www.ncbi.nlm.nih.gov/pubmed/11295157)] is interesting, and the author discusses research showing that depletion of intracellular magnesium may, by a variety of mechanisms, increase the likelihood that DNA damage will occur in the cell. Hartwig (2001) discusses the fact that magnesium binds to DNA polymerase enzymes and evidently serves a "preferred" role as a cofactor or regulatory cation in that regard. I'm not clear on what the mechanism is, because it sounds as if magnesium is just complexed with the phosphate groups of the nucleotide triphosphates that are serving as substrates for the DNA polymerases. I guess that's not the case, but the article doesn't explain the mechanism in a way that's especially clear. Magnesium or one of a variety of transition metals, such as manganese or nickel, can evidently stabilize the transition states of the polymerization reactions catalyzed by the polymerase enzymes, but the substitute cations (the transition metals) are poor replacements and lead to increases in errors, thereby potentially resulting in damage, during DNA replication. The intracellular magnesium content can also affect the activities of DNA excision repair enzymes, as the author discusses.

That's a really original and interesting article, even if the mechanisms underlying those effects of magnesium are still not especially clear. I think there is something to it, though, and the physiological and pharmacological effects of magnesium are really complicated. The articles that have cited that article by Hartwig (2001) probably go into some of the mechanisms, but I don't feel like looking through them right now [(http://scholar.google.com/scholar?num=50&hl=en&lr=&safe=off&cites=15766567627687007705)]. Dietary vitamin D can increase magnesium absorption [Miller et al., 1965: (http://jn.nutrition.org/cgi/reprint/85/3/255.pdf); Hardwick et al., 1991: (http://jn.nutrition.org/cgi/reprint/121/1/13.pdf) (http://www.ncbi.nlm.nih.gov/pubmed/1992050)], but there's some evidence that an increase in vitamin D intake may simultaneously increase the urinary excretion of magnesium (Hardwick et al., 1991). Nonetheless, Montgomery et al. (2004) [Montgomery et al., 2004: (http://www.animal-science.org/cgi/content/full/82/9/2742) (http://www.ncbi.nlm.nih.gov/pubmed/15452920?dopt=Abstract)] found that supplemental vitamin D3 did increase the intracellular phosphorus, magnesium, and calcium concentrations, expressed on a per-mg-cellular-protein basis, in the muscles of steers, and Miller et al. (1965), cited above, did find that an adequate vitamin D intake produced a net increase in magnesium retention. There are other articles that show graded increases in magnesium absorption with increasing serum 25-hydroxyvitamin D [25(OH)D] levels, but I don't feel like finding them now.

The capacity of an adequate vitamin D intake to prevent urinary phosphate loss might also be an important factor that could interact with the effects of magnesium on carbohydrate metabolism and nucleotide metabolism. Weisinger and Bellorin-Font (1998) [Weisinger and Bellorin-Font, 1998: (http://ca.geocities.com/mim_nephro/LancetMgP.pdf)
(http://www.ncbi.nlm.nih.gov/pubmed/9717944)] discuss research showing that an adequate supply of vitamin D helps to prevent hypophosphatemia by decreasing parathyroid hormone (PTH) levels. Kalaitzidis et al. (2005) found that low levels of serum phosphate and magnesium were associated with insulin resistance and other features of the so-called "metabolic syndrome" [Kalaitzidis et al., 2005: (http://www.nephrology.uoi.gr/uliko/Dimosievseis_engl_pdf/176.pdf)
(http://www.ncbi.nlm.nih.gov/pubmed/15861350)]. The authors discuss the fact that phosphate availability is important in carbohydrate metabolism, and magnesium obviously has many effects on carbohydrate metabolism.

The fact that an adequate supply of vitamin D can both lower serum PTH levels and increase or maintain intracellular phosphate concentrations is very significant, in my opinion, in part because of the "fragile" qualities of phosphate homeostasis. Supplementing with phosphate tends to be very problematic, unless there is some specific cause, such as renal failure, that makes supplementation, under a doctor's supervision, a valid approach. An increase in phosphate intake tends to reliably increase PTH levels, and the increased levels of PTH simply increase urinary phosphate excretion and can exacerbate acid-base abnormalities. For example, Ambuhl et al. (1999), cited below, found that serum bicarbonate levels increased progressively in response to phosphate supplementation. That would be an undesirable effect, in my opinion, for numerous reasons. Although Ambuhl et al. (1999) did not find a significant decrease in serum calcium in response to phosphate supplementation, other researchers have found that increases in phosphate intake can decrease serum calcium. In past postings [(http://hardcorephysiologyfun.blogspot.com/2009/02/potential-for-hypophosphatemia-or.html); (http://hardcorephysiologyfun.blogspot.com/2009/01/calcium-magnesium-serum-calcium-vitamin.html)], I've suggested that one approach might be to obtain a reasonable amount of calcium from foods (i.e. a total intake of 500-1000 mg/d or something, or to just not get enormous amounts of calcium from foods, as so many people are advocating these days), to not supplement with calcium, and to supplement with vitamin D and magnesium. That's just my opinion, and one should obviously talk about that with his or her doctor. Apart from the potential for calcium-induced hypercoagulability at high dietary intake ratios of Ca2+/Mg2+ (http://hardcorephysiologyfun.blogspot.com/2009/01/calcium-magnesium-serum-calcium-vitamin.html), Lau et al. (1984) [Lau et al., 1984: (http://www.ncbi.nlm.nih.gov/pubmed/6703070)] found that a high dietary intake of calcium strongly reduced phosphate absorption, decreased serum phosphate, and increased magnesium excretion in the urine to a pathological degree. The binding of calcium and phosphate in the G.I. tract is well-known. Given that both calcium and vitamin D decrease serum PTH levels and increase serum calcium levels slightly (and given that higher vitamin D intakes would be expected to increase or maintain serum or tissue phosphate and magnesium levels, while high dietary calcium intakes could be expected to decrease serum phosphate and magnesium levels), an increase in dietary vitamin D to an adequate level seems to me to be a better approach than these excessive increases in dietary calcium for decreasing serum PTH levels and maintaining serum calcium levels in the long term. There's considerable evidence that an increase in dietary vitamin D decreases serum PTH, in part or even largely, by leading to an increase in the localized production of hormonal vitamin D, 1alpha,25-dihydroxyvitamin D, from 25-hydroxyvitamin D in the parathyroid gland [Vieth et al., 2003: (http://jcem.endojournals.org/cgi/content/full/88/1/185) (http://www.ncbi.nlm.nih.gov/pubmed/12519850?dopt=Abstract)]. Magnesium deficiency is thought to produce resistance to the various signals that regulate PTH output by the parathyroid gland, in part by reducing adenylate cyclase activity in the parathyroid gland (Weisinger and Bellorin-Font, 1998).
But that's unlikely to be a mechanism mediating the PTH-suppressing effects of increases in vitamin D intake, in my opinion. The main message of the article by Vieth et al. (2003) is that, past a certain level of 25-hydroxyvitamin D, the levels of serum hormonal vitamin D [1alpha,25(OH)2D] will plateau or actually decrease. As the 25(OH)D levels increase further, the suppression of PTH levels will continue to increase, as a result of the autocrine or paracrine conversion of 25(OH)D into hormonal vitamin D in the parathyroid glands. That capacity of vitamin D to decrease PTH levels without also decreasing serum phosphate levels is very unusual and important, in my opinion, and it's not an effect that can be reproduced by taking massive amounts of calcium.

As long as one recognizes that this article [Ambuhl et al., 1999: (http://www.ncbi.nlm.nih.gov/pubmed/10561144)] is not meant to suggest that phosphate supplementation in normal people is a good idea, the results are very important, in my opinion. Ambuhl et al. (1999) found that restoring normal phosphate levels in people with renal failure increased the concentrations of ATP and other high-energy phosphate compounds in the muscles of the people whose phosphate homeostasis had been normalized. The really interesting thing is the discussion by the authors of the way in which the tissue phosphate levels tend to not correlate very reliably with the serum phosphate levels. Hypophosphatemia and hyperphosphatemia are both very dangerous, and the authors were not really saying that serum phosphate measurements are invalid or unimportant. But the authors were saying that, within the normal range of serum phosphate values, the tissue phosphate concentrations can vary considerably. In support of that concept, Ambuhl et al. (1999) found that the mean serum phosphate levels were within the normal range in both the control group and the phosphate-supplemented group by the end of the study period.

The point of that article is that the prevention of excessive urinary phosphate excretion and maintenance of serum phosphate levels can have significant effects. The association of normal or higher serum phosphate levels with higher insulin sensitivity is really important, in my opinion. Although the correction of severe magnesium deficiency can increase serum PTH and serum calcium levels initially, this is not a graded phenomenon and represents a restoration of the normal homeostatic regulatory mechanisms (Weisinger and Bellorin-Font, 1998). I mention that because it's mentioned frequently in the literature and isn't always put in the proper context.

Saturday, April 18, 2009

Protection Against Ischemic Damage by Free Fatty Acids in Diabetic Rats: Potential Relevance to Cholesterol Formation & Energy Metabolism in the Brain

This article [King et al., 2001: (http://ajpheart.physiology.org/cgi/content/full/280/3/H1173)(http://www.ncbi.nlm.nih.gov/pubmed/11179061?dopt=Abstract)] is interesting, and the authors found that exogenous palmitate, a saturated free fatty acid, was able to protect against damage to the hearts of diabetic rats during partial ischemia. King et al. (2001) also found that exogenous ketones were not protective. This type of research is consistent with one interpretation, as discussed previously (http://hardcorephysiologyfun.blogspot.com/2009/04/low-cholesterol-levels-and-risk-of.html), of research showing that very low cholesterol levels have been associated with higher incidences of intracranial hemorrhages/hemorrhagic strokes and deaths by suicide. King et al. (2001) found that exogenous palmitate had exerted an inhibitory influence on the overall rate of glycolysis, during ischemia, but had nonetheless been oxidized, even during ischemia, so as to produce a net augmentation of ATP production. The authors also noted that palmitate may have produced even more inhibition of the pyruvate dehydrogenase (PDH) multienzyme complex than had already been produced, in the diabetic rat hearts, from the effects of diabetes per se. The authors, additionally, discussed the fact that ketone-metabolizing enzymes are likely to be inhibited during ischemia, particularly in cardiac myocytes of diabetic rats, for various reasons. The ischemia-associated (and streptozotocin-induced) inhibition of ketone utilization may have contributed to the failure of exogenous beta-hydroxybutyrate to provide protection against ischemic/postischemic injuries (King et al., 2001).

Another specific factor the authors were getting at is that an elevation of lactate, which the authors found had occurred during ischemia, especially in the diabetic rats' hearts, tends to elevate the cytosolic NADH/NAD+ ratio, and this can, in cells with diminished oxidative capacity, lead to a decrease in the intramitochondrial NAD+/NADH ratio and inhibit enzymes whose activities are sensitive to changes in the NAD+/NADH ratio. This "translation" of the cytosolic redox state into the mitochondria occurs via the transport of substrates, such as aspartate and malate, into and out of the mitochondria, for example through the transporters that comprise the malate-aspartate shuttle. King et al. (2001) discuss research showing that the oxidation of lactate in cells from diabetic animals and humans is inhibited in a "specific" manner.

One thing that the authors may be getting at is that elevations in lactate and the lactate/pyruvate ratio, such as occur during the activation of anaerobic glycolysis during ischemia, can lead to the inhibition of the TCA cycle. An elevation in the lactate/pyruvate ratio is accompanied by an elevation in the NADH/NAD+ ratio and can, via the exchange of substrates across the inner mitochondrial membrane, decrease the intramitochondrial NAD+/NADH ratio and thereby inhibit the activities of TCA cycle enzymes. By inhibiting glycolysis, palmitate and, as the authors mention, hexanoate, may be essentially buffering the cytosolic NADH/NAD+ ratio. The authors also noted that the overall activity of the pyruvate dehydrogenase complex (PDHC) tends to be lower in the cardiac cells (and other cell types) of rats and humans with diabetes and that, in the face of a limited capacity of the cells to oxidize glucose, the exogenous palmitate had at least served as a substrate that the cells could oxidize. In other words, the PDHC activity would ideally have been higher and have allowed the cells to use glucose instead of palmitate, but the cells could only oxidize so much glucose, or metabolize so much glucose by glycolysis, without deranging the cytosolic redox state. In a context, such as the ischemic, diabetic heart, in which glucose metabolism is necessarily going to be deranged, fatty acid oxidation can evidently ameliorate the postischemic damage. King et al. (2001) also noted that other researchers had previously found exacerbations in postischemic injuries in response to exogenous palmitate, in part, as suggested by the authors, because other researchers had typically used total ischemia in their animal models. Total ischemia obviously prevents any oxidative metabolism from proceeding and prevents the utilization of palmitate or other saturated FFAs. Lloyd et al. (2004) found that the oxidative metabolism of many substrates, including palmitate, can make an important contribution to ATP production during even relatively severe ischemia [Lloyd et al., 2004: (http://ajpheart.physiology.org/cgi/content/full/287/1/H351)(http://www.ncbi.nlm.nih.gov/pubmed/15001444?dopt=Abstract)].
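
The lactate/pyruvate point can be made semi-quantitative with the usual near-equilibrium treatment of lactate dehydrogenase, in which the free cytosolic [NAD+]/[NADH] ratio is proportional to [pyruvate]/[lactate]. This is a generic sketch, not a calculation from King et al. (2001); the equilibrium constant of ~1.1 x 10^4 (at around pH 7) and the metabolite concentrations are assumed, illustrative values.

```python
# Sketch of the redox argument: with lactate dehydrogenase near equilibrium,
# free cytosolic [NAD+]/[NADH] ~ K * [pyruvate]/[lactate]. K and the metabolite
# concentrations below are assumed, illustrative values (not data from King et al.).

K_LDH = 1.1e4   # assumed near-equilibrium constant at ~pH 7

def cytosolic_nad_ratio(lactate_mM, pyruvate_mM, k=K_LDH):
    """Free cytosolic [NAD+]/[NADH] implied by the lactate/pyruvate ratio."""
    return k * (pyruvate_mM / lactate_mM)

print(cytosolic_nad_ratio(lactate_mM=1.0, pyruvate_mM=0.1))    # L/P = 10 -> ~1100 (more oxidized)
print(cytosolic_nad_ratio(lactate_mM=10.0, pyruvate_mM=0.2))   # L/P = 50 -> ~220  (more reduced)
```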

These articles [Bastiaanse et al., 1997: (http://cardiovascres.oxfordjournals.org/cgi/content/full/33/2/272)(http://www.ncbi.nlm.nih.gov/pubmed/9074689?dopt=Abstract); Bastiaanse et al., 1994: (http://www.ncbi.nlm.nih.gov/pubmed/8072018); Vauthey et al., 2000: (http://www.ncbi.nlm.nih.gov/pubmed/10822434); Olsen et al., 2007: (http://www.ncbi.nlm.nih.gov/pubmed/17761907); Zuliani et al., 2004: (http://www.ncbi.nlm.nih.gov/pubmed/15031316)] show that higher serum cholesterol levels or cell-membrane cholesterol levels (sarcolemmal cholesterol is plasma membrane cholesterol in cardiac myocytes) are associated with improved tolerance to anoxia or ischemia, whether in cultured cells or in various cell types subjected to ischemia in vivo. Three of those articles provide evidence that higher serum cholesterol levels are associated with the occurrence of less severe strokes and with decreases in the risk of post-stroke death. It's important to note that those studies do not evidently show that higher cholesterol levels reduce the risk of stroke, and it may be the case, in my opinion, that a person with "higher" cholesterol levels could have "more strokes" but have "less severe strokes," etc. The articles don't really show, in my opinion, that cholesterol is "good," and Bastiaanse et al. (1997) discuss research showing that higher plasma membrane cholesterol concentrations, in smooth muscle cells, can increase calcium influx. That would be undesirable both under "baseline," day-to-day conditions and during a stroke. Bastiaanse et al. (1997) also discuss research showing that plasma membrane cholesterol is degraded en masse, during ischemia, and that some of the cholesterol that isn't degraded is transported to the mitochondrial membranes, etc. It sounds as if some of it is, in fact, degraded to propionate and oxidized, in my opinion.

Cholesterol degradation can contribute to the propionate pool, and propionate can serve, via its metabolism into succinyl-CoA, as an anaplerotic substrate, but I'm not sure if the amounts of cholesterol-derived propionate would be large enough to contribute meaningfully to ATP production. The oxidation of cholesterol-derived propionate, a saturated, odd-chain "free fatty acid" (the term does not really apply to fatty acids that are formed intracellularly and oxidized in an autocrine manner), could conceivably help to maintain ATP production during partial ischemia. Again, I'm not sure if a significant amount would be formed from cholesterol. My guess is that it would, but I can't provide quantitative support of that supposition.

Another possibility is that the membrane cholesterol content exerts some regulatory effect on AMPK expression or on energy metabolism, etc. A lot of the research on that type of regulation has been centered around the feedback suppression of HMG-CoA reductase activity by cholesterol itself, in the liver. One reason I suggested increases, past some critical level, in saturated FFAs as a factor that could mediate protection against hemorrhage or "suicidal-depression-associated" impairments in astrocyte energy metabolism is that most of the cholesterol in the brain is thought to be made in situ, or locally in the brain. It's conceivable that there's some combination of an impairment in the oxidation of fatty acids, derived from the blood, and intracellular fatty acid synthesis, from glucose or glutamine or other substrates, in astrocytes or cerebral capillary endothelial cells, in people who become suicidally depressed. These are just my opinions and thoughts, but it's interesting that, for example, glutamine can serve as an energy substrate for astrocytes, a role that tends to be accompanied by the transient or partial inhibition of lipolysis and of the beta-oxidation of FFAs; glutamine is also a "lipogenic" substrate and, at the same time, a factor that has been shown to increase the oxidation of fatty acids [Iwashita et al., 2006: (http://www.ncbi.nlm.nih.gov/pubmed/16517950); (http://hardcorephysiologyfun.blogspot.com/2009/02/glutamine-decreases-plasma-free-fatty.html)]. A pronounced decrease in plasma FFAs might produce more of a decrease in brain cholesterol levels than a reduction in plasma cholesterol per se would, in my opinion. My current sense of the relationships is that a higher saturated fat intake will not especially reliably increase total serum cholesterol levels (via an increase in LDL cholesterol levels) in people with extremely low cholesterol levels, even though high saturated fat intakes do seem to be associated with higher total cholesterol levels. I would think that glucocorticoid resistance, in the context of chronic stress and depression, would impair the regulation of energy metabolism, via AMPK phosphorylation or dephosphorylation, for example, and confound a lot of these attempts to apply experimental results from nondepressed people to an understanding of the physiology at work in suicidal depression. But the presence of high cholesterol levels, from any cause, seems to be associated with higher FFA levels. Plasma ketone levels do not seem to reliably correlate with cholesterol levels, as shown in this article [Fukuda et al., 1991: (http://www.ncbi.nlm.nih.gov/pubmed/1897904)]. That's not a good example, but my point is that some kind of predictable relationship between plasma FFA levels and total cholesterol may be absent among severely depressed people, across a lower range of cholesterol levels, but may be present among people who are at the upper ranges of total cholesterol levels. And the extents to which FFAs are available for utilization (referring to the rate of uptake into the brain) or are utilized as energy substrates (referring to the rate at which they are oxidized) or cholesterol precursors by, for example, astrocytes, may not show reliable relationships with changes in the plasma FFA level.

Given the associations of depression with cardiovascular disease, it's conceivable to me that impairments in energy metabolism in the brain or cerebral vascular endothelial cells could impair the utilization of blood-borne FFAs, and so the issue might be as much about the rate of utilization of FFAs by astrocytes as it might be about the rate of uptake of FFAs into the brain. Even in the face of this poor utilization, it's conceivable that ATP production in astrocytes or endothelial cells could be very sensitive to small changes in plasma FFAs. I can't think or read about this topic any more right now. Obviously, these are only my opinions, and I'm not attempting to view any of this research in the context of any particular value system. Maybe the endothelial cells metabolize FFAs poorly in people who experience depression and allow saturated acyl-CoAs to accumulate, thereby interfering with oxidative metabolism and contributing to atherosclerosis. This accumulation could simultaneously restrict the transport of FFAs and other substrates into the brain, thereby leading to reductions in ATP production or membrane cholesterol biosynthesis in situ, etc. I think it's also likely to be important to differentiate, in research on these topics and associations, between people who have severe depression and may be suicidally depressed and those who have less severe depression. In any case, it's not a pleasant topic to think about or discuss.

Thursday, April 16, 2009

The Mesolimbic Reward Pathways in Depression

The authors of this article [Nestler and Carlezon, 2006: (http://www3.utsouthwestern.edu/contecenter/refs/Nestler2.pdf)(http://www.ncbi.nlm.nih.gov/pubmed/16566899)] discuss the potentially-important roles that dysregulations of the mesolimbic and mesocortical dopaminergic pathways may play in the etiology of major depression. Carlezon is a researcher at Harvard Medical School and has contributed to a lot of really interesting research, such as the research showing antidepressant effects of uridine or cytidine in animal models of depression [Carlezon et al., 2002; Carlezon et al., 2005, both cited and discussed here: (http://hardcorephysiologyfun.blogspot.com/2009/02/potential-psychiatric-pitfalls-in.html)]. Nestler and Carlezon (2006) note that the tendency has been to focus on changes in the hippocampus or prefrontal cortex in depression, etc. It's stunning to me that someone had to write an article like this. Of course one would expect, in my opinion, the mesolimbic reward pathways to be dysregulated in people with depression. It's also noteworthy that some hippocampal neurons project to the prefrontal cortex and are thought to be important in the regulation of dopaminergically-mediated working memory performance. For example, Seamans et al. (1998) [Seamans et al., 1998: (http://www.jneurosci.org/cgi/content/full/18/4/1613)(http://www.ncbi.nlm.nih.gov/pubmed/9454866?dopt=Abstract)] discussed the way in which dopaminergic neurons originating in the ventral tegmental area (VTA), in the midbrain [Carr and Sesack, 2000, cited here: (http://hardcorephysiologyfun.blogspot.com/2009/03/adenosine-and-guanosine-in-animal.html)], project to and form synapses with GABAergic interneurons in the prefrontal cortex. Via the activation of D1 dopamine receptors on those GABAergic interneurons, those dopaminergic inputs are thought to exert a largely inhibitory influence on the burst firing patterns of glutamatergic pyramidal neurons that originate in the prefrontal cortex and provide monosynaptic inputs back to the dopaminergic neurons in the VTA. The "regulatory" effects of glutamatergic inputs, originating in the prefrontal cortex, to the VTA are important for maintaining the normal burst firing patterns of VTA neurons, and the burst firing patterns of dopaminergic neurons in the VTA are crucially important for processes such as reward-based learning, cognitive functioning, and motivation, etc. Dopaminergic neurons in the VTA can also produce excitatory or inhibitory effects on glutamatergic neurons in the prefrontal cortex by modifying the excitatory or inhibitory effects of hippocampal neurons that provide inputs to the same classes of GABAergic interneurons that regulate the layer V glutamatergic neurons that project back to the VTA. In any case, those are some ways the reward pathways interact with the hippocampus and prefrontal cortex, and many of those interactions would, in my opinion, be relevant to an understanding of dysregulations of the reward pathways in people with major depression.

Wednesday, April 15, 2009

Elevation of Plasma Hypoxanthine During and After Glycogen-Depleting Exercise: Relevance to Energy Metabolism and Nucleotide Turnover in the Brain

This is a really interesting article [Sahlin et al., 1999: (http://www.ncbi.nlm.nih.gov/pubmed/10502075)], and the authors found that plasma hypoxanthine increased by a mean of 7.78-fold during exercise in humans. There was some variation in the magnitudes of the increases in plasma hypoxanthine among individuals. The increases in plasma uric acid (urate) and xanthine, which is derived from the metabolism of hypoxanthine by xanthine oxidoreductase (xanthine oxidase is technically a form of xanthine dehydrogenase that has been modified by proteolytic cleavage, etc., but people typically refer to the enzyme activity as being "xanthine oxidase" activity), were larger in terms of the amounts of those purines formed, but the percent increases were lower (a mean increase of 5 percent for urate and an increase to roughly 223% of the resting value, i.e. a 2.23-fold change, for xanthine). The effect of that increase in plasma hypoxanthine on the brain should not be underestimated. In a past posting, I discussed some of the research showing neuroprotective effects of remarkably low doses of hypoxanthine [Mink and Johnston, 2007, cited and discussed here: (http://hardcorephysiologyfun.blogspot.com/2009/03/protection-against-postischemic-damage.html)]. There's still the fact that the exercise-induced increases in neuronal activity in the brain would be expected to substantially, albeit transiently, increase the rates of purine export from neurons. I can't immediately find any articles showing elevated cerebrospinal fluid hypoxanthine and xanthine and urate levels, following exercise, but it's very likely that those elevations would occur, in my opinion, especially following high-intensity exercise. So that supposed depleting effect of exercise on neuronal and astrocyte purine nucleotide levels would be expected to lessen the impact of an increase in the plasma hypoxanthine level. I would think that, as a person's muscle mass increased over time, the muscles' capacity to export hypoxanthine would be increased and would produce more significant effects on the brain. But the person would, in my opinion, keep having to push the limits and produce glycogen depletion, as discussed below. Other researchers have discussed the effects of muscle-derived purines on the brain. There's one article from 1978 or 1979 that discusses the potential effects of exercise-induced elevations in extracellular ATP, released from endothelial cells, on the brain, but the authors focused too much on ATP per se. Extracellular ATP is rapidly degraded to ADP and then adenosine and hypoxanthine, etc., and so one would expect to see much more of an effect of exercise on adenosine or hypoxanthine than on ATP. I mean that the purines that might be expected to enter the brain in significant amounts, during exercise, and to also exert meaningful effects on nucleotide pools in neurons or astrocytes would be hypoxanthine or, conceivably, adenosine and not ATP, in my opinion. Xanthine is not salvaged efficiently (but can, in fact, be salvaged in small amounts to xanthosine and then guanosine), but hypoxanthine and adenosine are salvaged relatively efficiently by cells in the brain.

Sahlin et al. (1999) discuss the fact that researchers have generally found glycogen depletion from the skeletal muscles to be a prerequisite for the most pronounced, exercise-induced increases in plasma purines, including hypoxanthine. True glycogen depletion from a muscle group generally requires exhaustive exercise, and the research has generally shown, in my opinion, that resistance exercise produces more-pronounced degrees of glycogen depletion and purine depletion from the muscles [the general idea is that ATP depletion causes a loss of the capacity to salvage adenosine (and also guanosine), and this causes inosine monophosphate to accumulate and be converted into hypoxanthine and xanthine in the muscle cells] than exercise at low intensity does. Hellsten et al. (1998) [Hellsten et al., 1998: (http://ajpendo.physiology.org/cgi/content/full/274/4/E600)(http://www.ncbi.nlm.nih.gov/pubmed/9575819)] discuss research showing that a high-intensity exercise program causes a 20 percent decrease in the total adenine nucleotide contents of skeletal muscles (measured at rest, meaning that the decrease persists between exercise sessions rather than being only an acute, post-exercise effect). That's a remarkable fact and suggests to me that some benefit might be derived from low-dose adenosine or guanosine supplementation, particularly early in an exercise program. But that's my opinion. The effects of hypoxanthine on the brain should not be underestimated, and a single exercise session that produces an 8-fold elevation in plasma hypoxanthine levels, during the hour or few hours following exercise, could have a significant effect on the pools of adenine nucleotides, in particular, in the brain. Mink and Johnston (2007), cited above, discuss the fact that hypoxanthine appeared to be salvaged to a large extent, even during ischemia. The brain has an extremely low capacity to make purines de novo and depends almost entirely on purines exported from the endothelial cells lining the cerebral blood vessels or derived from the blood.

Of course, creatine and glutamine have been shown to augment the salvage of purine or pyrimidine nucleotides in various articles [cited and discussed here or in other postings: (http://hardcorephysiologyfun.blogspot.com/2009/02/interactions-of-glutamine-and-arginine.html)], but any supposed improvement in purine or pyrimidine salvage, in the brain, that might occur in response to the administration of those types of supplements might be offset by the increase in, for example, purine nucleotide export that could accompany a creatine- or glutamine-induced increase in exercise intensity. I don't think that's the way it would work, though, as long as the doses of creatine or glutamine are kept low, but that's just my opinion. For example, the combination of glutamine and inosine, which is hypoxanthine riboside (hypoxanthine attached to ribose to form a nucleoside), has been shown to maintain ATP levels in cultured cells deprived of glucose [Hodges and Snyder, 2004, cited here: (http://hardcorephysiologyfun.blogspot.com/2009/02/contribution-of-glutamine-to-pool-of.html)], as discussed below. That's just my opinion. In my opinion, low doses of glutamine, creatine, adenosine, and guanosine might produce some sort of buffering effect on brain energy metabolism and purine nucleotide pools. Creatine increases or "stimulates" oxidative metabolism, in part by maintaining the intramitochondrial ADP pool, and this would be expected to increase the extent to which glutamine, upon its metabolism into glutamate, can be metabolized into alpha-ketoglutarate and undergo oxidative metabolism in neurons or astrocytes in the brain. People discuss creatine as if it participates only in anaerobic metabolism, but this is just not the case. Creatine can prolong "aerobic" exercise and is known to increase oxidative glucose utilization in cells, etc.

Similarly, cytidine was recently shown to decrease the glutamine+glutamate pool in parts of the brains of humans [Yoon et al., 2009: (http://www.ncbi.nlm.nih.gov/pubmed/19194376)], and that type of effect could be explained in terms of an increase in the flux of substrates through the nonoxidative pentose cycle and glycolytic pathways, via the ribose-1-phosphate derived from cytidine catabolism in the brain. Uridine is converted into cytidine, and ribose derived from purines might be expected to produce a similar effect, to some extent. Hodges and Snyder (2004), in fact, discuss research showing that either glucose or glutamine can serve as an energy substrate for cultured cells but that inosine or some other "pentose source," meaning ribose derived from inosine, had to be present, in combination with glutamine, to maintain ATP levels in cultured cells that lacked a source of glucose. Exogenous nucleotides have been shown to elevate lactate levels in many articles, and those effects could be partly attributable to ribose-induced increases in the activities of glycolytic enzymes. The effect of cytidine could also have been the result of some effect on receptors that bind pyrimidine triphosphates, etc., but that type of mechanism, in my opinion, is less plausible. Nonetheless, I don't think ribose, by itself, would substitute for uridine past a certain point.
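
For what it's worth, the "pentose source" idea can be written out as the standard catabolic steps that free up ribose-1-phosphate from a nucleoside; these are generic textbook reactions, not something drawn from Yoon et al. or Hodges and Snyder specifically:

\[
\begin{aligned}
&\text{cytidine} + \text{H}_2\text{O} \xrightarrow{\text{cytidine deaminase}} \text{uridine} + \text{NH}_3 \\
&\text{uridine} + \text{P}_i \xrightarrow{\text{uridine phosphorylase}} \text{uracil} + \text{ribose-1-phosphate} \\
&\text{ribose-1-phosphate} \xrightarrow{\text{phosphopentomutase}} \text{ribose-5-phosphate} \rightarrow \text{nonoxidative pentose phosphate pathway} \rightarrow \text{glycolysis}
\end{aligned}
\]

Inosine hands off its ribose by the analogous phosphorolysis (purine nucleoside phosphorylase), which is presumably why it can stand in as the "pentose source" in the experiments Hodges and Snyder discuss.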

My overall point is that some of these energy substrates, such as creatine and glutamine and ribose, can increase the flux of intermediates through one pathway and produce beneficial effects that might, in my opinion, be limited by the "depleting" effect of that shift on one or another intermediate. At high doses, however, these types of supplements start to produce really complicated effects. The glutamine-induced decreases in plasma free fatty acids, for example, could, in my opinion, begin to become counterproductive, with respect to the brain, at higher dosages. Obviously, one would want to discuss any of these supplements with one's doctor before taking any of them or beginning any kind of exercise program.

Tuesday, April 14, 2009

Free Fatty Acid Transport and Metabolism in Endothelial Cells: Localized Increases in Utilization and Effects of AMPK Activation

This article [Hamilton and Kamp, 1999: (http://diabetes.diabetesjournals.org/cgi/reprint/48/12/2255)(http://www.ncbi.nlm.nih.gov/pubmed/10580412?dopt=Abstract)] is really terrific, and the authors discuss all of the unanswered questions about the transport and oxidation of free fatty acids (FFAs) by endothelial cells and other cell types. The authors point out that FFAs may cross plasma membranes by the passive, flip-flop "mechanism" and not just by active transport. The authors also discuss the fact that localized increases in the generation and utilization (oxidation) of FFAs, by endothelial cells and other cell types, can seriously confound attempts to view FFA metabolism in terms of strict rules. Dagher et al. (2001) [Dagher et al., 2001: (http://circres.ahajournals.org/cgi/content/full/88/12/1276?ijkey=6528ccf131742699838075ec892dca9444f4806b)(http://www.ncbi.nlm.nih.gov/pubmed/11420304?dopt=Abstract)] found that the activation (phosphorylation) of AMP-activated protein kinase (AMPK), by AICAR, in the presence of carnitine, substantially increased the utilization of fatty acids as energy substrates. Dagher et al. (2001) also found that some of the fatty acids were being generated from intracellular pools, a fact that could lead to erroneous conclusions in the context of research focusing exclusively on changes in plasma FFAs. Hamilton and Kamp (1999) note that the estimates of the solubility of palmitate have varied 100,000-fold (p. 2257) (0.1 nM vs. 10,000 nM at pH 7.4), and the movements of FFAs in and out of membranes are extremely dynamic.

AMPK activation is likely to occur in the context of insulin resistance, given that glucose availability is a major factor that keeps the ATP/AMP ratio high and prevents AMPK activation in cells. But AMPK activation is regarded as a "good" or "desirable" strategy for increasing glucose transport. So insulin resistance impairs ATP production and can thereby activate AMPK and increase glucose transport, and some treatments for diabetes are thought to increase glucose transport mainly by providing an extra stimulus for AMPK activation (they augment AMPK activation). But what about the effect of AMPK activation on FFA metabolism? I'm not saying that fatty acids are very effective energy substrates, but the endless focus on AMPK activation as a supposedly-effective strategy for restoring glucose transport has never made sense to me. That's just my opinion. I've discussed the fact that the PGC-1alpha-mediated increase in mitochondrial proliferation ("mitochondrial biogenesis") that occurs in response to AMPK activation is unlikely, in my opinion, to be beneficial and is likely to be maladaptive. In the majority of contexts in which I've seen mitochondrial proliferation discussed, whether in mitochondrial disorders or other contexts, the mitochondrial proliferation has been viewed as being maladaptive and as evidence of pathology. But Davidson and Duchen, 2007 [cited and discussed here: (http://hardcorephysiologyfun.blogspot.com/2009/01/mitochondrial-activity-in-endothelial.html)] note that mitochondria in endothelial cells have generally been viewed as being either "nonexistent" or unimportant. But the activities of the mitochondrial tricarboxylic acid cycle enzymes are an important factor that helps to maintain glycolysis, by indirectly maintaining the transport activities of, for example, the malate-aspartate shuttle, etc. In any case, there are some issues with research on fatty acid metabolism.
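
The reason a modest fall in ATP translates into a disproportionately large AMPK signal is the adenylate kinase equilibrium, which is worth writing down; this is standard biochemistry rather than anything specific to Dagher et al. or Hamilton and Kamp:

\[
2\,\text{ADP} \;\rightleftharpoons\; \text{ATP} + \text{AMP}, \qquad K_{eq} = \frac{[\text{ATP}][\text{AMP}]}{[\text{ADP}]^2} \approx 1
\quad\Longrightarrow\quad [\text{AMP}] \approx \frac{[\text{ADP}]^2}{[\text{ATP}]}
\]

Because [AMP] varies roughly with the square of [ADP] divided by [ATP], a small percentage drop in ATP produces a much larger percentage rise in AMP, which is what makes AMP (and the AMP/ATP ratio) such a sensitive input to AMPK, whatever one thinks of AMPK activation as a therapeutic goal.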

Sunday, April 12, 2009

Low Cholesterol Levels and Risk of Suicide or Cerebral Hemorrhage: Potential Relevance of Saturated Free Fatty Acids as Energy Substrates

There's a considerable amount of research that shows an association of very low plasma total cholesterol levels with an increase in the risk of death by suicide (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=cholesterol+suicide) or, disturbingly, by murder ("violent death") (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=cholesterol+%22violent+death%22). This is a disturbing topic, and there's also quite a bit of research that associates very low plasma cholesterol levels with an increase in the risk of hemorrhagic stroke (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=cholesterol+hemorrhage). The arguments that researchers have made, in their attempts to explain these associations, have not been very compelling to me, for the most part. Many people seem to have bought into the idea that low omega-3 fatty acid intake is some sort of "surrogate marker" for low plasma cholesterol and that the apparent increases in suicidality, in the context of low cholesterol levels, are actually a result of low omega-3 intakes. That idea has never made sense to me, and it would, in fact, directly conflict with other research that relates to the association of low cholesterol with hemorrhage. Specifically, researchers have found a positive association of high omega-3 fatty acid intakes with the risk of hemorrhage (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=%22omega-3%22+hemorrhage), and researchers have also found that low saturated fat intakes are associated with an increased risk of hemorrhage (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=%22saturated+fat%22+hemorrhage). (Note that researchers sometimes use the term "intraparenchymal hemorrhage" to refer to an intracranial hemorrhage into the central nervous system parenchyma, as opposed to the hepatic, or liver, parenchyma, etc.)

I think all of these associations could be explained in terms of decreases in saturated free fatty acid availability to the brain and to the endothelial cells lining the blood vessels supplying the brain. Saturated free fatty acids behave quite differently from unsaturated fatty acids, and there's considerable research showing that elevations in free fatty acids, following exercise, contribute strongly to glycogen replenishment in the liver and skeletal muscles. High-intensity exercise can elevate fasting plasma free fatty acids for four or more days post-exercise. Kiens et al. (1998) [Kiens et al., 1998: (http://ajpendo.physiology.org/cgi/content/full/275/2/E332)(http://www.ncbi.nlm.nih.gov/pubmed/9688636)] found, similarly, that fasting FFAs were still elevated at 42 hours post-exercise (note that the FFA measurement taken in the evening, at 30 hours post-exercise, was approximately a fasting measurement but might have been expected to be even higher in the true, fasted state, meaning in the morning). And strength training, performed correctly, as a form of high-intensity exercise, generally produces the greatest elevations in FFAs. There's increasing evidence that impairments in cellular energy metabolism may be at the root of some psychiatric conditions, such as depression, and astrocytes and endothelial cells are capable of oxidizing palmitate and other free fatty acids, in addition to ketones, as energy substrates. Additionally, there is research showing that fasting is beneficial to animals with traumatic brain injuries (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=%22Fasting+Is+Neuroprotective+Following%22). Everyone assumes that this is because fasting can elevate ketones (acetoacetate and beta-hydroxybutyrate, primarily), and ketones have been shown to exert neuroprotective effects in many articles (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=ischemia+%22beta-hydroxybutyrate%22+OR+acetoacetate).

But my sense is that ketone levels do not reliably correlate with plasma cholesterol levels. I'm not sure if *saturated* free fatty acid levels correlate with plasma cholesterol levels, but my sense is that the proportion of saturated free fatty acids would increase as plasma cholesterol increased. Saturated fatty acids are preferentially used for cholesterol biosynthesis, in comparison to monounsaturated (and, obviously, omega-3) fatty acids. Also, fasting for even short periods of time elevates plasma free fatty acids, but I think ketones do not necessarily become elevated to very significant levels until one has fasted for a fairly prolonged period of time. My reason for not focusing on ketones is that I've seen articles showing very inconsistent relationships between serum ketone levels and serum cholesterol levels, but this does not mean that ketones would not be important as energy substrates for neurons and astrocytes. Even the overnight fast, during sleep, elevates free fatty acids substantially, an effect that is thought to be partly due to the nighttime growth hormone release. Nonetheless, high-intensity exercise elevates both FFAs and ketones [Walsh et al., 1998: (http://www.ncbi.nlm.nih.gov/pubmed/9562294)]. But the research looking at associations of these factors with plasma cholesterol did not, in general, take into account whether the people were exercising, etc.
And increasing one's omega-3 fatty acid intake is thought to potentially increase ketone formation (by the liver, etc.), and that's not an effect one would expect to see if, as I'm assuming in the context of this discussion, an increase in omega-3 intake and a decrease in saturated fat intake were predisposing to hemorrhage [Freemantle et al., 2006: (http://www.ncbi.nlm.nih.gov/pubmed/16829066)]. The authors of many of the articles on fat intake have suggested that the supposed protective effect of saturated fat intake, in the context of the risk of hemorrhage, might have more to do with the ratio of saturated fats to omega-3 fats. In other words, a person who eats more saturated fats will, at a given caloric intake and dietary composition, tend to eat fewer grams of omega-3 fats and have a higher percentage of saturated fatty acids in erythrocyte membrane phospholipids than a person who eats fish three times a day would, etc. Other researchers have suggested that a higher saturated fat intake simply increases cholesterol levels and thereby produces the supposed protection against intracranial hemorrhage. Although saturated fats (palmitate, etc.) are generally thought to be used more for cholesterol biosynthesis than many other fatty acids, I don't think that small increments in saturated fatty acid intake would produce large, stepwise increases in plasma cholesterol (I think even an increment, or increase, of something like 18 grams per day of saturated fat was found to be associated with protection from hemorrhage in some articles, and that's not some kind of enormous intake level). That's my sense of it, at least. But increasing the ratio of dietary saturated fats to polyunsaturated fats would, in my opinion, be expected to increase the percentages of erythrocyte phospholipids containing saturated fatty "acyl" side chains and, by extension, to increase the saturated fraction of the plasma FFA pool.

Additionally, one explanation for the association of low cholesterol with hemorrhage was that a low cellular cholesterol content in smooth muscle cells made those cells less resistant to anoxia or hypoxia. Why would this be? Perhaps, during hypoxia or ischemia, cholesterol is degraded more extensively to propionate (cholesterol is, in fact, a source of odd-chain fatty acids, such as propionate, that can be metabolized to succinyl-CoA and thereby serve as anaplerotic substrates). I can't find the article discussing that, but I remember it. The researchers were saying that cerebral ischemia can produce localized, smooth-muscle-cell necrosis ("arterionecrosis"), by diminishing blood flow to a section of an artery (although one would think, in my opinion, that low cholesterol would then be associated with venous hemorrhage, also, and not just or even mainly with arterial hemorrhages), and that an "adequate" smooth-muscle-cell cholesterol content would reduce that arterionecrosis. The localized, necrotic death of smooth muscle cells is one explanation for reperfusion-induced or reperfusion-associated hemorrhage. It's possible that there is not a great deal of validity to that "explanation" for the association of low cholesterol with hemorrhage (the researchers' concept that a low smooth muscle cell cholesterol content might decrease the resistance of those cells to hypoxia/anoxia), and my suggestion that a "cellular-cholesterol-content-dependent" increase in propionate oxidation occurs may also not be valid. But finding evidence that would contradict those concepts would not negate the possibility that increases in, for example, the "area under the curve" for saturated plasma free fatty acids could be associated with less depression or with improvements in the maintenance of astrocyte glycogen contents at different times during the day or during recovery from exercise, etc.

There can be a tendency to view all free fatty acids as being always bad, and one could argue that this tendency has more to do with value-laden dogma about physiological processes being "good" or "bad" ("good fats" vs. "bad fats," etc.) than with the underlying physiology. If all free fatty acids, including palmitate and other saturated free fatty acids, were bad for cellular energy metabolism all the time, then why can resistance exercise, which elevates plasma free fatty acids significantly for prolonged periods of time but simultaneously tends to improve insulin sensitivity (elevations in FFAs are normally associated with a worsening of insulin sensitivity) and to increase the capacity of skeletal muscle cells and other cell types to oxidize those fatty acids, produce improvements in mood in some people [Doyne et al., 1987: (http://www.runningtherapie.nl/Portals/0/ccp555748.pdf)(http://www.ncbi.nlm.nih.gov/pubmed/3454786)]? I think an important distinction needs to be made, from the standpoint of energy metabolism, between saturated free fatty acids and unsaturated free fatty acids (a.k.a. nonesterified fatty acids). Saturated fatty acids do not produce all of the same inhibitory effects, on the activities of enzymes and on the binding of ligands to receptors, as unsaturated fatty acids produce. In many cases, saturated fatty acids do not produce any of the less-than-desirable "regulatory" or toxic effects, at least in experiments performed in vitro, that unsaturated fatty acids produce.

I'm not saying that elevating saturated free fatty acids is likely to be a particularly good strategy in any particular disease context, but, for example, endothelial cells in some blood vessels are known to depend on either glutamine or free fatty acids, to a significant extent, as energy substrates, particularly in the fasted state. Also, I think that inappropriate elevations in saturated plasma free fatty acids could reasonably be expected to contribute to atherosclerotic disease, particularly if those elevations occurred outside of the context of something like resistance training or some other form of exercise. But if the role of saturated free fatty acids could be investigated in some of these contexts that I've discussed, researchers might be able to develop alternative substrates for maintaining cellular energy metabolism, such as in astrocytes and neurons, and avoid the problems associated with beta-oxidation of fatty acids in the cells of adults. Beta-oxidation is not a particularly efficient or "clean" process and tends to be problematic, from the standpoint of the flux of substrates through the tricarboxylic acid cycle, etc. (problematic from the standpoint of carbohydrate metabolism). Nonetheless, I would argue that there is a need to face some of these disturbing associations. To acknowledge that free fatty acids may influence astrocyte energy metabolism in some positive ways, as, in my opinion, they may, is not to say or imply that this is "good" or that people should eat massive amounts of saturated fats or try to elevate free fatty acids artificially, outside of the context of something like strength training/resistance exercise. Rather, the idea, in my opinion, could be to research alternative cellular energy substrates, such as glutamine, etc., in those contexts.
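
To put a rough number on the "not particularly efficient, from the standpoint of carbohydrate metabolism" point, here's a back-of-the-envelope comparison of ATP yield per oxygen consumed for glucose versus palmitate, using the commonly assumed P/O ratios of about 2.5 per NADH and 1.5 per FADH2; the exact values are debated, so treat the numbers as illustrative rather than definitive:

\[
\begin{aligned}
&\text{glucose:} \quad \sim 32\ \text{ATP} \;/\; 6\ \text{O}_2 \;\approx\; 5.3\ \text{ATP per O}_2 \\
&\text{palmitate:} \quad \sim 106\ \text{ATP} \;/\; 23\ \text{O}_2 \;\approx\; 4.6\ \text{ATP per O}_2
\end{aligned}
\]

So complete beta-oxidation of palmitate yields far more ATP per molecule but roughly 10-15 percent less ATP per unit of oxygen than glucose does, which is one reason the choice of substrate matters for tissues, like the brain, in which oxygen delivery can be the limiting resource.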

Saturday, April 11, 2009

Trophic Effects of Exercise on Motor Neurons: Potential Involvement of Satellite Cell-Derived IGF-I and Other Skeletal Muscle-Derived Growth Factors

This article [Gardiner et al., 2006: (http://0-jap.physiology.org.library.pcc.edu/cgi/content/full/101/4/1228)(http://www.ncbi.nlm.nih.gov/pubmed/16778002?dopt=Abstract)] is really interesting, and the authors discuss research providing evidence that growth factors produced locally in the skeletal muscles, in response to exercise, exert trophic actions on the motor neurons that innervate those muscles, in part through retrograde axonal transport. Insulin-like growth factor-I (IGF-I), for example, is produced by satellite cells in response to strength training and can act on motor neurons innervating the muscles [Neff et al., 1993: (http://www.ncbi.nlm.nih.gov/pubmed/8301266)]. A growth factor like IGF-I can modify the release of other growth factors in the spinal cord, such as by undergoing retrograde axonal transport or by increasing the activities of enzymes that are components of intracellular signalling cascades. IGF-I could modify gene transcription in the cell body of a motor neuron, for example, by activating various mitogen-activated protein kinase cascades near the axon terminal of the motor neuron. The IGF-I might then be degraded before it undergoes axonal transport, etc. Gardiner et al. (2006) discuss research showing that exercise increases BDNF expression (mRNA content) in the lumbar spinal cord in animals and also can increase the axonal diameters of motor neurons. In my opinion, resistance exercise (strength training) provides a much more potent stimulus for these types of trophic effects on the brain and spinal cord than lower-intensity exercise does, and there's been a great deal of research on these types of effects of strength training or other forms of exercise. One problem is that it seems difficult to set up experiments to allow animals to do "strength training." One can make them run at a high intensity on some kind of treadmill, etc., but I would think it would be difficult to really reproduce the effects of resistance exercise in animal experiments. I also think there's a lack of recognition of the kinds of mechanisms that could come into play in response to resistance exercise. Growth factors are produced by satellite cells in the skeletal muscles, in response to some types of exercise (especially resistance exercise), and those growth factors could conceivably be transported along both the afferent sensory fibers innervating the skeletal muscles and the motor neurons themselves. The exercise-induced increases in the firing rates of afferent sensory neurons could also provide trophic effects in the spinal cord, etc.

Friday, April 10, 2009

Report of Liver Dysfunction or Damage Associated With Creatine/Whey Protein Ingestion: Potential Relation to the Urea Cycle

This article [Whitt et al., 2008: (http://www.ncbi.nlm.nih.gov/pubmed/18452122)] discusses a case of reversible liver dysfunction/damage associated with creatine intake. I can't get the full text of the article at the moment, but there's also a study showing liver damage in animals on high-dose creatine [Tarnopolsky et al., 2003: (http://ajpregu.physiology.org/cgi/content/full/285/4/R762)(http://www.ncbi.nlm.nih.gov/pubmed/12959920?dopt=Abstract)]. I do think it might be possible for creatine to produce liver dysfunction at excessive dosages, in the long term, but that's just my opinion. The articles on the "alternate urea cycle" that occurs in people with chronic renal failure suggest that creatine or other guanidino compounds that have sometimes been shown to accumulate, at relatively low levels in humans taking 5 grams/d of creatine [Derave et al., 2004: (http://jap.physiology.org/cgi/content/full/97/3/852)(http://www.ncbi.nlm.nih.gov/pubmed/15107411?dopt=Abstract)], could inhibit urea cycle enzymes or interfere with the transport of urea cycle intermediates, etc., and produce liver dysfunction, in my opinion. This doesn't mean that creatine is "toxic" or "bad" but just means that there tends to be a therapeutic dosage range for a physiological substrate, such as creatine, and then a dosage range across which derangements in the transport or utilization of substrates with similar structures may begin to occur. That's just my opinion, but I discussed research, in a past posting, showing that high levels of glutamine, for example, could reduce endothelial nitric oxide production from eNOS, and the effect was thought to be due to the inhibition, by glutamine, of citrulline uptake or reutilization [Kawaguchi et al., 2005, cited and discussed here: (http://hardcorephysiologyfun.blogspot.com/2009/03/inhibition-of-nitric-oxide-dependent.html)]. Citrulline is a urea cycle intermediate, and so it's conceivable, in my opinion, that abnormally-high levels of glutamine or a guanidino compound derived from the transamination of arginine could, in the liver, interfere with the metabolism of citrulline in the urea cycle, etc. Glutamine, citrulline, and creatine are structurally similar, to some extent, and there's a lot of research showing competition for transport among those compounds or competitive inhibitory effects on enzymes, etc. I don't have time to get into the research on guanidinosuccinate, which has been shown to decrease in response to creatine supplementation, or argininic acid and the other guanidino compound that can accumulate during creatine supplementation (Derave et al., 2004, cited above). This could be another reason to consider using the lower end of the dosage range of creatine monohydrate used in clinical trials (~3-10 grams/d), as discussed previously, and to discuss this with one's doctor.
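
For readers who want the citrulline connection spelled out, these are the standard urea cycle steps in which citrulline sits; the reactions themselves are textbook biochemistry, and the idea that creatine, glutamine, or related guanidino compounds might interfere at one of these points is my speculation, not something demonstrated in the articles cited above:

\[
\begin{aligned}
&\text{ornithine} + \text{carbamoyl phosphate} \xrightarrow{\text{ornithine transcarbamylase}} \text{citrulline} + \text{P}_i \\
&\text{citrulline} + \text{aspartate} + \text{ATP} \xrightarrow{\text{argininosuccinate synthetase}} \text{argininosuccinate} + \text{AMP} + \text{PP}_i \\
&\text{argininosuccinate} \xrightarrow{\text{argininosuccinate lyase}} \text{arginine} + \text{fumarate} \\
&\text{arginine} + \text{H}_2\text{O} \xrightarrow{\text{arginase}} \text{ornithine} + \text{urea}
\end{aligned}
\]

Citrulline also has to be shuttled out of the mitochondrion (and ornithine carried back in) by a dedicated transporter, and a transport step like that is exactly the sort of place where competition from a structurally similar compound could plausibly show up first.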

Wednesday, April 8, 2009

Adenosine, PKA Activity, CREB Activation, Synapsin I Activation, and the Actions of Antidepressants

These articles [Consogno et al., 2001: (http://www.ncbi.nlm.nih.gov/pubmed/11543736); Zanotti et al., 1998: (http://www.ncbi.nlm.nih.gov/pubmed/9833637)] show that S-adenosylmethionine (SAM-e) increased the activity of calmodulin-dependent protein kinase II (CaMK-II) and protein kinase A [cyclic adenosine monophosphate (cAMP) dependent protein kinase, PKA] in various parts of the brain, and some of these increases were similar to changes that are sometimes produced by various antidepressant drugs. I think those effects are due to the actions of adenosine, derived from the exogenous SAM-e, as I've discussed in recent postings (http://hardcorephysiologyfun.blogspot.com/2009/03/adenosine-and-guanosine-in-animal.html). For example, Cheng et al. (2002) found that the activation of A2A adenosine receptors produced an increase in (or, when one of the NGF-activated, intracellular mitogen-activated protein kinase cascades was inhibited, a normalization of) NGF-induced neurite growth in cultured neurons, and this effect of A2A adenosine receptor activation was dependent on the PKA-induced activation (phosphorylation) of cAMP response element binding protein (CREB) [Cheng et al., 2002: (http://www.jbc.org/cgi/reprint/277/37/33930)(http://www.ncbi.nlm.nih.gov/pubmed/12114502?dopt=Abstract)]. Similarly, Diogenes et al. (2004) [Diogenes et al., 2004: (http://www.jneurosci.org/cgi/reprint/24/12/2905)(http://www.ncbi.nlm.nih.gov/pubmed/15044529?dopt=Abstract)] found that both A2A adenosine receptor activation and PKA activity were required for the in vitro neurotrophic effects of brain-derived neurotrophic factor (BDNF) on hippocampal neurons. Some of these articles (http://scholar.google.com/scholar?num=100&hl=en&lr=&safe=off&q=CREB+adenosine+activation+A2A+OR+A3) show that adenosine receptor activation can produce an increase in the phosphorylation of CREB by PKA and in various phospho-CREB-induced changes in gene expression in various cell types. An increase in CREB phosphorylation can result from an increase in the activity of CaMK-II, p38 MAPK, PKA, or other intracellular kinase enzymes or signalling pathways, and increases in phospho-CREB levels, in the hippocampus, among other parts of the brain, have traditionally been viewed as one change that accompanies the response to an antidepressant treatment (an antidepressant effect). The reality is probably much more complex, as noted by Manier et al. (2002) [Manier et al., 2002: (http://www.ncbi.nlm.nih.gov/pubmed/11793165)], given that antidepressants may produce their effects, under some conditions or in some people, by decreasing phospho-CREB levels. There's also research showing that guanosine can increase CREB activation or cAMP levels, but I can't get into all of that now. An increase in CREB activation is just one of many changes that can occur in the brain in response to some antidepressants, but that's the type of mechanism that could, in my opinion, conceivably explain the ways in which longer-term antidepressant effects could emerge out of the short-term antidepressant effects of guanosine and adenosine in animal models of depression (http://hardcorephysiologyfun.blogspot.com/2009/03/adenosine-and-guanosine-in-animal.html). I don't think an increase in phospho-CREB levels could have occurred over those short-term experiments, but an increase in cAMP could have occurred.
I tend to think those short-term effects are more inhibitory and have more to do with inhibition of glutamate release by guanosine or adenosine, etc., as implied by the results. The activation of A1 adenosine receptors and other adenosine receptor subtypes can preserve phosphocreatine, etc.
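
Schematically, the canonical cascade being invoked here is the following; this is textbook signal transduction rather than anything specific to these particular papers:

\[
\text{adenosine} \rightarrow \text{A}_{2A}\text{ receptor } (\text{G}_s) \rightarrow \text{adenylyl cyclase}\uparrow \rightarrow \text{cAMP}\uparrow \rightarrow \text{PKA} \rightarrow \text{phospho-CREB (Ser133)} \rightarrow \text{CRE-dependent transcription (BDNF, etc.)}
\]

The early steps (cAMP and PKA) can change within minutes, whereas the transcriptional output downstream of phospho-CREB takes much longer, which is consistent with the idea that the acute behavioral effects and the longer-term trophic effects run on different parts of the same pathway.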

Incidentally, SAM-e, in one of those articles I cited at the beginning of this posting, was also shown to increase synapsin I protein levels (Consogno et al., 2001), and synapsin I is a protein substrate of CaMK-II (synapsin I is phosphorylated by CaMK-II and thereby enters the cytosol) that can regulate neurotransmitter release. SAM-e produced an increase in the cytosolic, "soluble" synapsin I concentration that was, therefore, likely to have been the result of a SAM-e-induced increase in CaMK-II activity. There's also a considerable amount of research showing reductions or changes in synapsin I protein levels in people with depression, etc. (http://scholar.google.com/scholar?q=%22synapsin+I%22+antidepressant&hl=en&lr=).

Sunday, April 5, 2009

Research on the Use of Creatine Monohydrate as an Adjunctive to Prescription Antidepressants

These are some more articles discussing the use of creatine in combination with prescription antidepressants or medications to treat Parkinson's disease [Roitman et al., 2007: (http://www.ncbi.nlm.nih.gov/pubmed/17988366); Amital et al., 2006: (http://ajp.psychiatryonline.org/cgi/reprint/163/10/1840-b.pdf)(http://www.ncbi.nlm.nih.gov/pubmed/17012702); Bender et al., 2006: (http://www.ncbi.nlm.nih.gov/pubmed/17030762)]. Amital et al. (2006) researched the effects of creatine at 3 grams per day, for one week, and then 5 grams per day, for four weeks (and then also for eight weeks after the end of the trial), in a person who was being treated with prescription medications for post-traumatic stress disorder and depression with fibromyalgia. The authors found that, during the course of the trial, the person's scores on the Hamilton Depression Rating Scale had decreased, suggesting that the creatine had produced an antidepressant effect, and that the person's symptoms of fibromyalgia had improved. Roitman et al. (2007) found that the full antidepressant effect in the patients had required four weeks to emerge, but the greatest reductions in the Hamilton Depression Rating Scale had occurred within the first two weeks in most of the people. Roitman et al. (2007) also found that some people experienced a greater improvement in mood from 3 grams/day of creatine than from 5 grams/day.

Bender et al. (2006) used 4 grams a day of creatine for two years (after an initial phase of 20 grams per day for 6 days and then 2 grams per day for six months), in a randomized, placebo-controlled trial, and found that the people who had taken creatine had not had to increase the dosages of their dopaminergic medications for Parkinson's disease to the extent that the people taking the placebo had had to, and that effect was statistically-significant. The other main result of the trial was the finding that creatine had produced a statistically-significant reduction in symptoms of depression, as measured by the "score" on Item 3 of Part I of the Unified Parkinson's Disease Rating Scale (UPDRS). Incidentally, I don't think the "loading dose" concept of creatine makes much sense, especially given the potential for high-dose creatine to produce plasma and extracellular fluid volume expansion in some people [Powers et al., 2003: (http://www.pubmedcentral.nih.gov/picrender.fcgi?artid=155510&blobtype=pdf)(http://www.ncbi.nlm.nih.gov/pubmed/12937471)]. To the extent that that effect would occur more with high-dose than with low-dose creatine, one would expect the body to be less able to adapt to, or compensate for, that effect in the context of a "loading dose." But that's just my opinion. Obviously, one would want to discuss this with one's doctor before using creatine or any supplement, particularly given that Roitman et al. (2007) found that, in one of the ten patients, creatine produced a transient improvement in the person's mood and then worsened it. Roitman et al. (2007) also found that creatine produced mania or hypomania in the people in the trial who had been diagnosed with bipolar disorder.

In my opinion, creatine would be more effective in this context in combination with adenosine monophosphate/triphosphate or guanosine monophosphate or both, for complex reasons. I think it's a really important point, but it's just my opinion. I could explain my reasoning in more detail, but I'd have to cite a lot more articles and can't do that right now. I discussed some of the rationale for the combination in a past posting (http://hardcorephysiologyfun.blogspot.com/2009/03/creatine-cr-phosphocreatine-pcr-and.html), and I discussed some of the research and details on orally-administered guanosine monophosphate and adenosine monophosphate or adenosine triphosphate in this posting (http://hardcorephysiologyfun.blogspot.com/2009/03/adenosine-and-guanosine-in-animal.html). Other approaches that might potentiate the effects of creatine, in my opinion, would be the use of methylcobalamin at ~5 mg/day or something (with the intent of disinhibiting creatine kinase enzymes and tricarboxylic acid cycle enzymes by reducing methylmalonic acid levels), the use of L-methylfolate or levoleucovorin in combination with methylcobalamin (with the intent of reducing the accumulation of sarcosine, which may be produced from high-dose creatine, given that folate depletion can compromise the metabolism of sarcosine and dimethylglycine and betaine, as I've discussed in the context of research cited in past postings), and the use of adenosine and guanosine to elevate serum uric acid to high-normal levels, under a doctor's supervision (with various "purposes" in mind, including the disinhibition of creatine kinase, which is inhibited by peroxynitrite, through the scavenging of peroxynitrite by uric acid) [discussed and cited here: (http://hardcorephysiologyfun.blogspot.com/2009/03/creatine-cr-phosphocreatine-pcr-and.html)]. Poortmans et al. (2005) [Poortmans et al., 2005: (http://www.ncbi.nlm.nih.gov/pubmed/16260971)] discussed the fact that the elevation of urinary methylamine and formaldehyde, in response to high-dose creatine administration, could have resulted from the metabolism of some of the creatine to sarcosine, evidently by the creatinase enzymatic activity in microorganisms in the G.I. tract, if memory serves (humans don't express a creatinase enzyme, as far as is known). Sarcosine can be metabolized to methylamine by sarcosine reductase or to formaldehyde and glycine, by sarcosine oxidase, and methylamine can be metabolized to formaldehyde and ammonia by semicarbazide-sensitive amine oxidase (SSAO) (Poortmans et al., 2005). Formaldehyde can then be converted into formate, and the activities of the folate cycle enzymes, which are dependent upon adequate pools of reduced folates, such as L-methylfolate or levoleucovorin, and vitamin B12 (i.e. methylcobalamin), are important for the normal metabolism of formate.
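
Just to lay out, in one place, the pathway described by Poortmans et al. (2005), here is a compact sketch; the enzyme assignments for the sarcosine and methylamine steps are theirs, and the formate-handling step is standard one-carbon (folate) biochemistry rather than something from their paper:

\[
\begin{aligned}
&\text{creatine} + \text{H}_2\text{O} \xrightarrow{\text{bacterial creatinase}} \text{sarcosine} + \text{urea} \\
&\text{sarcosine} \xrightarrow{\text{sarcosine reductase}} \text{methylamine} \qquad \text{or} \qquad \text{sarcosine} \xrightarrow{\text{sarcosine oxidase}} \text{glycine} + \text{formaldehyde} \\
&\text{methylamine} + \text{O}_2 + \text{H}_2\text{O} \xrightarrow{\text{SSAO}} \text{formaldehyde} + \text{NH}_3 + \text{H}_2\text{O}_2 \\
&\text{formaldehyde} \rightarrow \text{formate} \xrightarrow{\text{formate-THF ligase (folate cycle)}} \text{10-formyl-tetrahydrofolate} \rightarrow \text{CO}_2 \text{ or purine synthesis}
\end{aligned}
\]

That last step is the sense in which adequate reduced folate and vitamin B12 status would be expected to help keep formaldehyde and formate from accumulating when sarcosine production is increased.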