This article [Yamauchi et al., 2002: (http://www.ncbi.nlm.nih.gov/pubmed/11934546)] is interesting, and the authors found that exogenous arginine supplementation only increased the proliferation of cultured epithelial cells when extra glutamine was also added. The effects of arginine+glutamine occurred largely through the capacity of glutamine to serve as a precursor for purine and pyrimidine nucleotides formed by their respective de novo biosynthetic pathways. Glutamine has been shown to specifically enhance de novo purine biosynthesis in the same cell line that Yamauchi et al. (2002) used (Caco-2 colonic epithelial cells) [Boza et al., 2000: (http://www.ncbi.nlm.nih.gov/pubmed/10900556)]. The influence of glutamine availability on de novo uridine (pyrimidine) biosynthesis is better known, but the main point of Yamauchi et al. (2002) is that glutamine alone was less effective in supporting cell proliferation than the combination of glutamine and arginine was. The authors found that the presence of extra arginine reduced the conversion of glutamine into arginine, via the conversion of arginine into glutamate. Glutamate did not substitute for glutamine in these experiments, and that's what one would expect (given, in my opinion, the ATP requirement for the conversion of large amounts of glutamate into glutamine, as discussed by researchers). There could be some ATP buffering in response to exogenous glutamate, under some circumstances, given its potential to be converted into alpha-ketoglutarate, for example, and to help maintain the pools of tricarboxylic acid cycle intermediates. But the glutamate would have to be converted into glutamine to serve as a precursor of purines or pyrimidines.
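For reference, these are the main glutamine-dependent steps I have in mind, written from memory and simplified, so double-check the exact stoichiometries in a biochemistry text. The first committed step of de novo pyrimidine biosynthesis (CPSII) and the first committed step of de novo purine biosynthesis (amidophosphoribosyltransferase) both consume the amide nitrogen of glutamine:

\[
\text{glutamine} + \mathrm{HCO_3^-} + 2\,\mathrm{ATP} + \mathrm{H_2O} \longrightarrow \text{carbamoyl phosphate} + \text{glutamate} + 2\,\mathrm{ADP} + \mathrm{P_i} \quad \text{(CPSII)}
\]
\[
\text{glutamine} + \text{PRPP} + \mathrm{H_2O} \longrightarrow \text{5-phosphoribosylamine} + \text{glutamate} + \mathrm{PP_i} \quad \text{(amidophosphoribosyltransferase)}
\]

Glutamine also donates a second nitrogen of the purine ring (at the FGAM synthetase step) and the amide nitrogen of GMP, which is part of why glutamine availability can limit both pathways.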
Yamauchi et al. (2002) also found that exogenous nucleotides "spared" glutamine, as would be expected, by reducing its incorporation into nucleotides formed by the de novo pathways. This glutamine-sparing effect of exogenous nucleotides has been shown in other articles, and researchers would usually attribute this to the standard mechanisms by which purines or pyrimidines suppress their respective de novo biosynthetic pathways, by allosteric effects. But the author of that article I cited in my last posting was referring to "nonstandard" mechanisms by which exogenous purine nucleotides could deplete pyrimidines from different cell types. Two mechanisms that Shambaugh (1979) [(http://hardcorephysiologyfun.blogspot.com/2009/02/abbreviated-posting-on-carbamoyl.html)] mentioned are the inhibition of uridine kinase by purines and the inhibition of the cell-cycle-associated upregulation of carbamoyl phosphate synthetase II (CPSII) by purines. Traditionally, only uridine nucleotides have been thought to exert feedback suppression of de novo uridine formation, by inhibiting CPSII activity.
The interesting point that shows up in one of the graphs in Yamauchi et al. (2002) and in the results of Boza et al. (2000) is that exogenous nucleotides can't substitute for glutamine in some cell types, and the enhancement of cell proliferation that occurs in response to exogenous nucleotides tends to "plateau" in those cells. One might suppose that it's a high rate of de novo nucleotide biosynthesis that accounts for this, but Yamauchi et al. (2002) note that Caco-2 cells have a low capacity to produce nucleotides by the de novo pathways. But Boza et al. (2000) found that glutamine (or, more specifically, if one takes the results of Yamauchi et al. (2002) into account, glutamine+arginine) could increase the retention of exogenous nucleotides by enhancing the activities of nucleotide salvage enzymes (presumably by providing tricarboxylic acid cycle intermediates or by conversion into glucose, etc.). One way of explaining that is that nucleotide biosynthesis increases dramatically at specific points in the cell cycle, and exogenous nucleotides could enter cells in one part of the cell cycle and then be exported and act, extracellularly, on plasma membrane adenosine or purinergic receptors and regulate the cell cycle in aberrant ways (or be converted into uric acid, etc.). I have an article that discusses the fact that, in proliferating lymphocytes, glutamine is used as a source of glucose or tricarboxylic acid cycle substrates most of the time but becomes crucially important for pyrimidine biosynthesis at specific points in the cell cycle. The toxic effects of excessive amounts of purines on lymphocytes used to be explained in terms of the conversion of adenosine into deoxyadenosine, which can inhibit ribonucleotide reductase (thereby inhibiting DNA replication) and S-adenosylhomocysteine hydrolase (thereby interfering with one-carbon metabolism and with nucleotide metabolism in other ways). But, in the context of supplementation in humans without known genetic defects in adenosine deaminase activity (as in severe combined immunodeficiency), other mechanisms could, in my opinion, produce undesirable effects at high doses, and some of those effects could be independent of hyperuricemia.
But one key point that comes out of these articles is that glutamine and arginine could, in my opinion, be used to reduce the therapeutic dosages of nucleotides in the conditions for which exogenous purines or pyrimidines have been used or researched. The suppression of CPSII by purines would not be completely undesirable, however, given that the excessive activation of de novo pyrimidine biosynthesis can lead to orotate accumulation, and orotate tends to exert many toxic effects. Orotate is an intermediate in uridine biosynthesis; it's used to induce fatty liver in one animal model and can produce ATP depletion in other cell types. Glutamine could also increase AICAR levels, and reduced folates, in combination with methylcobalamin, could be useful as strategies to ensure that AICAR does not accumulate in the presence of exogenous glutamine and arginine. Folates have also been shown to reduce orotate accumulation [Van der Weyden et al., 1979, cited here: (http://hardcorephysiologyfun.blogspot.com/2009/01/purines-and-orotic-acid-in-porphyrias.html)] in proliferating cells, but I don't know what the mechanism is. Another implication is that some of the neuroprotective effects of uridine or of purines, such as guanosine and adenosine or inosine, may be due to the "sparing" of the intracellular glutamine pools in cells in the brain (via the suppression of CPSII and via the standard mechanisms exerted by purines, which are the suppression of PRPP synthetase and amidophosphoribosyltransferase activities), thereby maintaining the pool of tricarboxylic acid cycle intermediates or supporting glycolysis by ribose-5-phosphate-independent mechanisms, etc.
Saturday, February 28, 2009
Friday, February 27, 2009
Abbreviated Posting on Carbamoyl Phosphate Synthetase II and Nucleotides
This is a really good article [Shambaugh, 1979: (http://www.ajcn.org/cgi/reprint/32/6/1290.pdf)(http://www.ncbi.nlm.nih.gov/pubmed/35970)], and the author cites research showing that the administration of either exogenous purines (nucleotides of adenosine, guanosine, or inosine) or exogenous pyrimidines (nucleotides of uridine or cytidine or both), given separately, has been shown to suppress carbamoyl phosphate synthetase II in various cell types (discussed on the last two pages). The author also goes over other mechanisms by which an excess of purines could produce depletion of pyrimidine nucleotides, such as uridine, by inhibiting uridine biosynthesis in the de novo pathway (which is dependent on glutamine and can be influenced by glutamine availability). This would be beneficial, to a certain extent. When the urea cycle cannot dispose of ammonia rapidly enough, carbamoyl phosphate is known to be exported from the mitochondrial pool (which, in liver cells, is usually compartmentalized so as to be available only for entry into the urea cycle), and an excess of carbamoyl phosphate tends to lead to an increase in the production of orotate (the relevant stretch of the pathway is sketched at the end of this posting). Orotate (a.k.a. the conjugate base of orotic acid) is normally a precursor of uridine, but inhibition of mitochondrial respiration is known to prevent uridine formation from dihydroorotate and to be associated with orotate accumulation. Orotate is well-known to be capable of inducing ATP depletion and fatty liver disease, and there are countless articles showing that dietary arginine depletion can, over the long term or under conditions of metabolic stress or glucocorticoid resistance, lead to fatty liver disease by elevating orotate levels (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=arginine+orotate+OR+orotic). (Chronically elevated glucocorticoid production tends to increase the activities of the urea cycle enzymes, and this eventually can put a strain on the capacity of the liver to maintain its arginine pool from dietary sources and from endogenous biosynthesis from glutamate, via arginase and ornithine delta-aminotransferase.) That second article in the list is a bad one, in my opinion, but I don't have time to go through it. I actually don't have time to finish this posting now.
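For orientation, here is the stretch of the de novo pyrimidine pathway I'm referring to, written from memory and simplified. The dihydroorotate-to-orotate step is catalyzed by dihydroorotate dehydrogenase (DHODH), which hands its electrons to ubiquinone in the mitochondrial respiratory chain, and that's the link between respiration and this pathway:

\[
\text{carbamoyl phosphate} + \text{aspartate} \longrightarrow \text{carbamoyl aspartate} \longrightarrow \text{dihydroorotate} \xrightarrow{\text{DHODH (ubiquinone-linked)}} \text{orotate}
\]
\[
\text{orotate} + \text{PRPP} \longrightarrow \text{OMP} \longrightarrow \text{UMP} \longrightarrow \text{UDP, UTP, and the other uridine nucleotides}
\]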
Thursday, February 26, 2009
The Urea Cycle, Renal Glutaminase Activity, and Arginine in the Regulation of Serum Bicarbonate, Nucleotide Metabolism, and Cellular Energy Metabolism
This article is really interesting [Haussinger et al., 1990: (http://www.springerlink.com/content/l2vx314521367706/)], and the authors discuss the way the urea cycle in the liver normally consumes a lot of bicarbonate, during the overall conversion of ammonia and bicarbonate into urea and carbon dioxide, and thereby tends to decrease serum bicarbonate. When a person's liver function becomes impaired, the kidneys dispose of more ammonia in the urine, thereby acidifying the urine and further increasing the serum bicarbonate (which already tends to be elevated because of urea cycle dysfunction in liver disease).
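As I understand the argument, the net hepatic reaction is usually written roughly like this (from memory, so double-check the balancing); each molecule of urea consumes two bicarbonate ions along with the two ammonium ions, and roughly three ATP (four high-energy phosphate bonds) are spent per urea:

\[
2\,\mathrm{NH_4^+} + 2\,\mathrm{HCO_3^-} \longrightarrow \text{urea} + \mathrm{CO_2} + 3\,\mathrm{H_2O}
\]

So when ureagenesis slows down in liver disease, less bicarbonate is consumed, and serum bicarbonate drifts upward.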
This is relevant to supplementation with glutamine and arginine in any number of contexts, including sports nutrition and the many other conditions in which glutamine and arginine have been used. Glutamine supplementation can increase urinary ammonia loss and elevate serum bicarbonate [Welbourne, 1995: (http://hardcorephysiologyfun.blogspot.com/2009/02/potential-for-hypophosphatemia-or.html)], but this effect of glutamine is, in my opinion, more highly regulated and more responsive to changes in serum bicarbonate than the effect of alpha-ketoglutarate is (alpha-ketoglutarate abnormally elevates the cytosolic, as opposed to intramitochondrial, pool of alpha-ketoglutarate). Haussinger et al. (1990) cite reference 53 as evidence that, in people who are already displaying metabolic alkalosis, meaning that the serum bicarbonate is elevated, glutamine supplementation or infusion does not increase urinary ammonia loss (and would therefore not be expected to further elevate serum bicarbonate or abnormally increase urinary phosphate loss). But another message of this and other articles is that this homeostatic functioning of renal glutaminase will, essentially, only function properly in a person whose serum ammonia levels are not elevated. This means that the urea cycle in the liver has to be functioning normally; otherwise, the homeostatic stopgap measures for disposing of ammonia (excreting it in the urine in unusually large amounts, in a manner that is dependent on renal glutaminase activity and could be sensitive to changes in glutamine availability through supplementation) have the potential to create wild fluctuations in serum bicarbonate.
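The reason renal ammonia excretion raises serum bicarbonate is the standard renal ammoniagenesis stoichiometry, which, roughly and from memory, works out to the following per glutamine extracted by the proximal tubule (the carbon skeleton ends up as glucose or CO2):

\[
\text{glutamine} \longrightarrow 2\,\mathrm{NH_4^+}\ (\text{excreted in the urine}) + 2\,\mathrm{HCO_3^-}\ (\text{returned to the blood}) + \text{carbon skeleton}
\]

Each ammonium ion excreted in the urine therefore represents, in effect, one "new" bicarbonate added to the blood, which is why an increase in renal glutamine extraction tends to push serum bicarbonate up.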
The message of this and the other articles I've been reading is, in my opinion, that arginine supplementation could help to minimize the risk that glutamine supplementation will produce acid-base disturbances (and the consequent reductions in serum ionized calcium and serum phosphate that can result from increases in serum bicarbonate). The authors of this article [Feillet et al., 1998: (http://www.ncbi.nlm.nih.gov/pubmed/9686348)] discuss the fact that supplemental L-arginine is used in the treatment of all of the genetic defects, except arginase deficiency, that produce hypofunctionality in urea cycle enzymes. Supplementing with arginine has, in my opinion, major advantages over supplementing with ornithine or citrulline. One advantage is that the biosynthesis of arginine from citrulline requires two moles of ATP per mole of arginine synthesized (the relevant reactions are sketched at the end of this paragraph), and Morris et al. (2004) [see here for reference: (http://hardcorephysiologyfun.blogspot.com/2009/02/contribution-of-glutamine-to-pool-of.html)] referred to this, implicitly, as a rationale for the use of supplemental arginine under conditions of metabolic stress. Another advantage of arginine is the fact that an increase in the dietary ratio of ornithine to arginine may decrease creatine biosynthesis in the kidneys, via the ornithine-induced inhibition of renal arginine:glycine amidinotransferase (AGAT) activity [see Crim et al. (1976), Stromberger et al. (2003), and Arias et al. (2004), cited here: (http://hardcorephysiologyfun.blogspot.com/2009/02/glutamate-glutamine-cycle-de-novo.html)]. Moreover, argininosuccinate synthetase is the rate-limiting step in the urea cycle, and the provision of arginine effectively bypasses this rate-limiting step and simultaneously provides a source of intracellular, "compartmentalized" ornithine that can enter the urea cycle. This might seem paradoxical, given that arginine essentially provides more nitrogen than ornithine does. But the point is partly that arginine serves as a kind of metabolic "branch point" substrate that is in high demand by nitric oxide synthase enzymes and by many other enzymes, such as those that use arginine for polyamine biosynthesis, protein synthesis, creatine biosynthesis, etc. A similar rationale applies to the use of L-glutamine [Young and Ajami, 2001: (http://jn.nutrition.org/cgi/reprint/131/9/2449S) (http://www.ncbi.nlm.nih.gov/pubmed/11533293?dopt=Abstract)], as opposed to alpha-ketoglutarate or glutamate (glutamine synthetase consumes glutamate and ATP, and exogenous glutamine spares both by reducing the demand for de novo glutamine synthesis). But there's the additional fact that fairly low doses of calcium alpha-ketoglutarate were shown to produce vomiting and other gastrointestinal symptoms in one study in people with kidney disease [Bro et al., 1998: (http://www.ncbi.nlm.nih.gov/pubmed/9469496)]. Five of 17 people who had taken calcium alpha-ketoglutarate had to drop out of that arm of the trial because of those types of symptoms, and alpha-ketoglutarate reduced both serum ionized calcium and serum phosphate (Bro et al., 1998).
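Here are the reactions I'm referring to, written from memory (the pyrophosphate released by argininosuccinate synthetase is hydrolyzed, which is why the cost is usually counted as two ATP equivalents):

\[
\text{citrulline} + \text{aspartate} + \mathrm{ATP} \longrightarrow \text{argininosuccinate} + \mathrm{AMP} + \mathrm{PP_i} \quad \text{(argininosuccinate synthetase)}
\]
\[
\text{argininosuccinate} \longrightarrow \text{arginine} + \text{fumarate} \quad \text{(argininosuccinate lyase)}
\]
\[
\text{glutamate} + \mathrm{NH_4^+} + \mathrm{ATP} \longrightarrow \text{glutamine} + \mathrm{ADP} + \mathrm{P_i} \quad \text{(glutamine synthetase)}
\]

Providing preformed arginine or preformed glutamine sidesteps those ATP costs, which is the "sparing" argument in a nutshell.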
I've explained, in past postings, the mechanisms I think are responsible for the effects seen by Bro et al. (1998), and, in my opinion, those reductions in serum ionized calcium and serum phosphate, reductions that may occur in the context of abnormally elevated ammonia due to liver disease or kidney disease or other states of disease or metabolic or physical stress, are most likely to have been a consequence of the abnormal acceleration, by alpha-ketoglutarate, of urinary acidification (and the concomitant elevation of serum bicarbonate) via renal ammonia disposal. And if one has to limit the dosages of alpha-ketoglutarate, because of its gastrointestinal side effects, to doses that are so small as to have only a minimal glutamine-sparing effect at extrarenal sites, that's a major disadvantage, in my opinion.
These concepts are relevant to the potential effects of arginine or glutamine in the brain or in cerebral vascular endothelial cells, but I don't have time to discuss that now. One major point of interaction is between the urea cycle and glutamine-dependent, de novo uridine biosynthesis (and also de novo purine biosynthesis, which requires glutamine), at the level of carbamoyl phosphate synthetase II. I also think that type of interaction is relevant in cells that lack the full complement of urea cycle enzymes (cells outside the liver). But impairment of the overall activities of the urea cycle enzymes abolishes, to a significant extent, the feedback inhibition of carbamoyl phosphate synthetase II by uridine triphosphate and uridine diphosphate. This leads to excessive orotate accumulation, and orotate has a multitude of toxic effects and can produce significant ATP depletion, etc. In my opinion, this would suggest the usefulness of arginine and glutamine, in some sort of balanced dosage approach, in combination with uridine, in the context of neuroprotection and the other purposes for which supplemental uridine has been used or researched. The potential usefulness of exogenous arginine as a strategy for elevating creatine biosynthesis in the brain has been proposed by Arias et al. (2004) [see here for that reference and related references: (http://hardcorephysiologyfun.blogspot.com/2009/02/glutamate-glutamine-cycle-de-novo.html)], and, in my opinion, the use of arginine would be a safer approach for elevating creatine in the brain, such as in the context of psychiatric diseases [see Roitman et al. (2007), cited in that blog posting], than the use of creatine itself would be. I say that for reasons that are too numerous to go into.
Monday, February 23, 2009
Note on the Orange Juice
I forgot to mention that, in that article by Welbourne (1995) [(http://hardcorephysiologyfun.blogspot.com/2009/02/glutamine-decreases-plasma-free-fatty.html)], the author used an orange juice drink with the glutamine. I'm not sure what percentage of the total sugars in orange juice is fructose, but I think it's high. Fructose has a low glycemic index (something like 20), and it doesn't produce much insulin release. That's probably one reason the orange juice didn't interfere with the effect. But the author would have been better off just being systematic about it and avoiding the confounding influence of "the orange juice." Glutamine, incidentally, can be gluconeogenic without elevating insulin levels, but I can't get into that now.
Note on the False Transmitter Effect From L-5-HTP
This is one article showing the false-transmitter effect of L-5-hydroxytryptophan (L-5-HTP) [Stamford et al., 1990: (http://www.ncbi.nlm.nih.gov/pubmed/2357555)]. That effect wouldn't occur with L-tryptophan (dopaminergic neurons don't express tryptophan hydroxylase, except perhaps for a few subsets that could conceivably co-express tyrosine hydroxylase and tryptophan hydroxylase). It's sort of interesting. A lot of articles showing neuroendocrine effects have used a peripheral decarboxylase inhibitor (benserazide or carbidopa) in combination with 5-HTP. But that wouldn't prevent the false transmitter effect. It would potentiate it, to the extent that the decarboxylase inhibitor would increase the entry of 5-HTP into the brain (and one couldn't take a peripheral decarboxylase inhibitor without a prescription anyway). But as far as interpreting research on 5-HTP goes, one has to see whether the researchers used carbidopa or benserazide, etc. I don't know how much of a dose would be taken up by dopaminergic neurons, and I haven't read the full text of that article (in a long time, if at all).
Glutamine Decreases Plasma Free Fatty Acids and Then Accelerates Postprandial Lipolysis: Relevance to Mechanism of Growth Hormone Release
There are too many articles on this subject to cite all of them, but I was reading about glutamine in relation to cellular energy metabolism and the urea cycle, etc., and I came across some information that would be relevant to the issue of growth hormone (GH) release. I think some researchers in the literature are still scared of GH or think it's going to make people crazy or give them some kind of scary roid rage, but the articles discuss the fact that, in the absence of exercise, its effects are mainly on body composition. The elevations of IGF-1, to the extent that they occur in response to exercise, are more from the exercise itself. The IGF-1 elevations following non-exercise-induced GH release tend to be less pronounced and variable. At least that's my impression. The effects that GH has on increasing lipolysis and inhibiting insulin-induced triglyceride storage in adipocytes are IGF-1-independent, however.
I'll just put this information up here, because, in spite of the polarized quality of the articles, in which GH release frequently tends to be either demonized or viewed as a panacea, GH can sometimes have vasculoprotective effects and can increase exercise tolerance in people who have difficulty exercising, etc. The last time I read about this in any detail was quite awhile ago, and this will be hastily done. In any case, GH is a double-edged sword in terms of its effects on insulin sensitivity. If a person has an abnormally low degree of GH release, such as in response to high-intensity exercise or sleep, as in "GH deficiency," normalizing or increasing GH release can increase insulin sensitivity. Under other circumstances, the GH-induced increases in free fatty acids (FFAs) are thought to be capable of impairing insulin sensitivity. One mechanism is that the increases in FFAs can impair either glucose oxidation or glucose transport, etc., and thereby produce insulin resistance. That's one reason one should discuss this with his or her doctor before using these types of methods for inducing GH release. It's generally a good idea to be exercising, also, with this type of thing. That's the usual advice, but one relevant effect of exercise is to decrease fasting triglyceride levels or to offset any potential for the insulin resistance that can result from GH-induced elevations in FFAs, etc. Exercise improves insulin sensitivity by many other mechanisms, too, and fasting hyperglycemia will tend to blunt the effects of these types of approaches.
People seem to have forgotten, in many of the articles, some of the basic premises that one has to consider in order to use GH releasers. If one were going to use them, there would be two or three windows during the day at which they would probably be most effective, in my opinion. For GH release to occur, in the absence of some stimulus (such as sleep or high-intensity exercise), insulin and glucose levels, in particular, have to be low. Free fatty acids also should be low, and there's evidence that the nocturnal and exercise-induced pulsatile releases of GH are attenuated or limited ("brought to a halt" or cut short, etc.) by the GH-induced elevations in FFAs. So the morning, before breakfast, is one time at which many of the researchers have administered GH releasers under experimental conditions. The glucose and insulin levels are very low (the lowest at any point in the day, as far as I know, under normal circumstances), and this makes the fasted state ideal for those reasons. The FFAs, on the other hand, tend to be highest in the morning and are very high at night, in part, as researchers have suggested in many articles, because of the nocturnal GH release that occurs about 90 minutes into sleep. That's one reason something like L-glutamine, which, at a dose of 2-3 grams, lowered FFAs in association with GH release [Welbourne, 1995: (http://www.ncbi.nlm.nih.gov/pubmed/7733028)], would, in my opinion, be one approach for taking advantage of that window, especially in combination with L-arginine. The GH release that occurs in response to arginine is inhibited by elevations in FFAs, meaning that glutamine could reasonably be expected to enhance arginine-induced GH release. It's bizarre to me that hundreds of articles report on using massive doses of L-glutamine (the usual dose in many studies is 30-50 grams per day), and yet only one or two studies have evaluated glutamine use in the context of GH release in humans. There's one other article, besides that article by Welbourne (1995), that used another bizarrely low dose of glutamine in combination with glycine and niacin, I think. Glycine, especially, in my opinion, would be a really bad choice as a GH releaser, because it's been implicated in uremic encephalopathy and is basically an excitatory neurotransmitter, for all practical purposes, whose entry into the brain is almost unregulated. This is in contrast to glutamate and glutamine, whose uptake from the blood into the brain is limited and highly regulated. An increase in the intracellular glutamine or glutamate concentration in cerebral microvascular endothelial cells, in the blood vessels lining the brain, can decrease glutamate or glutamine efflux, but the entry of glutamine or glutamate into the brain is, in my opinion, much more highly regulated than the entry of glycine into the brain is. I read one article stating that glycine has a 20-hour half-life in the brain, and glycine encephalopathy, due to hyperglycinemia, can occur in all sorts of conditions. In any case, one window is in the fasted state, and one would have to wait about 60-90 minutes, at least, for the GH release to occur. Then, the IGF-1 production, to the extent that it occurs, would occur postprandially. The fat oxidation would also occur postprandially, etc. I'm not making any value-laden statement by providing this information, in any case.
The dosage range for arginine is 6-9 grams, and there's a recent article in Growth Hormone and IGF Research showing that dosages in excess of 9 grams (13 grams was used in the article) produce dose-limiting side effects of stomach upset, etc. The dosage range for glutamine is large, and I don't know what to say on that. In my recent postings, I noted the potential for glutamine to increase phosphate excretion. The increases in plasma bicarbonate that can occur in response to the glutamine-induced urinary acidification, which also tends to concomitantly increase phosphate excretion, also tend to acutely decrease the serum ionized calcium and have been shown to increase urinary calcium excretion. The potential effect on ionized calcium is a well-known mechanism and occurs because, if I remember, the carboxyl groups on residues of the circulating albumin molecules become deprotonated and bind calcium more effectively. This decreases free (ionized) calcium, at least transiently. I don't know if these effects would always occur or not, but those would be some reasons to be especially careful to talk to one's doctor. Anyway, I don't have time to go through everything, but glutamine also has been shown to elevate plasma arginine, citrulline, alanine, and sometimes glutamate (in addition to glutamine). The elevation in citrulline is, according to a wide consensus among researchers, the result of the metabolism of glutamine in the small intestine. The citrulline is then converted into arginine in the kidneys, and the kidneys are normally the main source of blood-borne arginine. The glutamine-induced decrease in FFAs is also probably, in my opinion, one mechanism of GH release. Niacin and the niacin analogue acipimox both increase GH by acutely decreasing plasma FFAs. If one looks at the graph of the glutamine-induced decrease in FFAs in this article (the dose works out to be about 17 grams, which sounds like a fairly high single dose) [Iwashita et al., 2006: (http://www.ncbi.nlm.nih.gov/pubmed/16517950)], one sees that the graph looks very similar to the one in the Welbourne (1995) article I cited, which showed GH release in the context of a glutamine-induced decrease in FFAs (a decrease that is then followed by an upswing in FFAs). The idea is that the decrease in FFAs, in part, produces GH release, and the GH then increases FFAs. In any case, the glutamine-induced elevations in plasma arginine would also be expected to contribute to the GH release. That article didn't mention GH release, but it's possible, with that high dose of glutamine, that it occurred, even during a meal. Some amino acids and other compounds, such as L-tryptophan, I think, and dopaminergic drugs (L-dopa, for example, which would not really be useful, in my opinion), as well as high-intensity exercise, can induce GH release during or between meals. L-tryptophan has some theoretical issues with it, and some researchers are saying the increases in metabolites along the kynurenine pathway could produce milder forms of eosinophilic myalgias, etc. It seems a little hard to believe, in my opinion, but it might be one reason to use the lower range of dosages on that. I haven't read about the dosage range for L-tryptophan, but I remember seeing some recent articles in relation to GH release. The potential for increases in prolactin, from elevations in serotonin release by tryptophan, would be one thing I'd be concerned with. Prolactin exerts a feedback suppression of the firing rates of dopaminergic neurons, a direct effect.
There's also a longer-term suppression of the gonadotropic axis by prolactin, if I remember correctly. L-5-hydroxytryptophan (L-5-HTP) has the false transmitter issue, meaning that it can be taken up by dopaminergic axon terminals, decarboxylated into serotonin by aromatic L-amino acid decarboxylase (which nonselectively decarboxylates either L-dopa or L-5-HTP), and then stored in vesicles for release. Upon release, the stored serotonin basically doesn't do anything but effectively reduces the amount of dopamine that's released (a low-level, reserpine-like effect). That can occur, but it may only occur at higher doses. I don't know the dose-response information on that "false transmitter" effect of L-5-HTP. Serotonergic substances, like tryptophan, can also indirectly inhibit dopaminergic activity, via serotonergic inputs to neurons in the ventral tegmental area or nucleus accumbens (I think it's mainly the VTA neurons). Given that dopaminergic activity is among the most potent GH-releasing factors, this seems like it could be an issue, in my opinion, at high dosages. The decreases in the firing rates of VTA neurons that could result from chronic increases in the release of serotonin from neurons providing inputs to the VTA would not be expected to directly influence GH release, but the point is that the blunting of dopaminergic transmission, by an increase in serotonin availability, over the long term, could generalize to pathways other than those involving the VTA.
Arginine's mechanism isn't known all that well, in spite of the vast amount of research on it. It releases GH by inhibiting somatostatin release, and anticholinergic drugs can block arginine-induced GH release. But that's not saying what the mechanism is. The release of GH by clonidine, which is an alpha2-adrenoreceptor agonist and decreases the firing rates of noradrenergic neurons, at least in the locus ceruleus, can be enhanced by increasing cholinergic transmission [Cordido et al., 1990: (http://www.ncbi.nlm.nih.gov/pubmed/2159483)]. That actually makes sense to me, but other interactions of clonidine are really paradoxical. Exercise increases GH release by both cholinergic and adrenergic mechanisms, but it's mainly adrenergic. Arginine itself is, in my opinion, not cholinergic per se, but it may act on neurons that receive cholinergic inputs or something. Arginine may be nitrergic or something, and I did a quick search to see where the research has gone in the past 30 years, since the books on this topic came out. An increase in neuronal nitric oxide release could be one mechanism, and other articles have shown that arginine can release dopamine in the striatum (i.e. from ventral tegmental area neurons projecting to the ventral striatum). Given that effect, which is likely to be fairly indirect, it's conceivable that arginine could release dopamine from dopaminergic neurons in the arcuate nucleus and thereby produce dopaminergically-mediated GH release (even via nitrergic inputs to the arcuate nucleus or ventromedial hypothalamus). But the cholinergic dependence of the GH release is probably polysynaptic and really complicated, because I really doubt that arginine releases acetylcholine to any significant extent. It also could just elevate the glutamate pool or be converted into tricarboxylic acid cycle intermediates and increase cAMP levels, transiently, in neurons in the ventromedial hypothalamus, etc. The arcuate nucleus (in the mediobasal hypothalamus) and the ventromedial hypothalamus are among the neuronal groups that express growth hormone releasing hormone (GHRH) in the brain, etc. But supposedly arginine inhibits somatostatin release, implying that it would act on neurons whose cell bodies are in, or whose axons project to, the paraventricular nucleus or periventricular nucleus of the hypothalamus (cell groups that express somatostatin and regulate hypothalamic GHRH release, etc.). The neuroanatomy is extremely, extremely complex, and I don't have the stomach for trying to map it out, especially given the unintelligible and bizarre quality of much of the research in this area.
I'm just providing this information because of the haphazard quality of so much of the research. Researchers used to say that one would need to take some measures to enhance one's insulin sensitivity for GH release to occur, even nocturnally. In any case, I'm not really up for going through all the stuff on this, because the information has been around for 30 years. Many people seem to have forgotten it or to never have been aware of it, however. One can't just choose arbitrary, tiny dosages and take them at random times, after meals, and expect any GH to be released. There's also newer research showing a 4-6 hour refractory period, due to autoinhibition of GH release by GH itself, following maximal GH release. That's somewhat less certain, but the idea is that a person couldn't release GH in the morning and then expect a large effect from exercise unless the exercise occurred at least 4-6 hours later. In any case, the other window is before bed, and the idea is to finish eating at least 3-4 hours before bed, which most people do. The point is that insulin and glucose are going to be elevated all day, even between meals, and will inhibit GH release to all stimuli except exercise, in many cases. The plasma amino acids will also be elevated after and between meals, all day. Some of these will compete with arginine or even glutamine for entry into the brain, etc. This information has been well-known for thirty years, but, in any case, I've barely scratched the surface of it. It's old news, but it's the type of thing that one can be aware of and just, essentially, know about.
Friday, February 20, 2009
Potential "Psychiatric Pitfalls" in the Context of Antiaging and Sports Nutrition, etc.
The authors of this article [Tamminga et al., 1977: (http://www.ncbi.nlm.nih.gov/pubmed/869054)] discuss the fact that choline itself, administered as free choline (as a choline salt, such as choline chloride, etc.), or choline-containing compounds, such as phosphatidylcholine (lecithin) or alpha-glycerophosphocholine (GPC), caused severe depression in two patients in their study. The authors note that excessive cholinergic activity, such as can be induced by drugs that inhibit the breakdown of acetylcholine (ACh), which is produced from choline in the central nervous system (CNS), can produce depression, and this phenomenon is well-known [Dilsaver, 1986: (http://www.ncbi.nlm.nih.gov/pubmed/2435781); Dilsaver, 1986: (http://www.ncbi.nlm.nih.gov/pubmed/3517080); Overstreet and Djuric, 2001: (http://www.ncbi.nlm.nih.gov/pubmed/12000038); Fritze et al., 1988: (http://www.ncbi.nlm.nih.gov/pubmed/3244788)]. Those aren't perfect examples, but cholinergic drugs or ACh precursors, such as choline, tend to antagonize dopaminergic and noradrenergic/adrenergic transmission in the brain. That's an imprecise statement, but there's a vast amount of evidence to show that it's the case. In Parkinson's disease, for example, cholinergic drugs worsen symptoms, and anticholinergic drugs tend to enhance dopaminergic activity and relieve involuntary motor symptoms, etc. The monoaminergic-cholinergic antagonism tends to occur in areas of the brain that regulate both motor control and affective or cognitive functioning. In any case, people seem to have forgotten this phenomenon or to not be aware of it, because choline is added to all sorts of supplements. In my opinion, choline, in the absence of some evidence that a person has a deficit in cholinergic transmission in the brain, is more likely to produce depressive symptoms than it is to enhance cognitive functioning or produce some other nonspecific "brain-boosting" effect. Dopaminergic and noradrenergic activity are, in my opinion, much more important for the "everyday" cognitive functions, such as in the context of the "working memory" that depends on the activities of neurons in the prefrontal cortex, than cholinergic activity is.
This is an oversimplification, but there are some disturbing articles on the effects that choline can have. Lyoo et al. (2003) [Lyoo et al., 2003: (http://www.ncbi.nlm.nih.gov/pubmed/12895208)] found that oral choline caused a generalized depletion of purines from the brains of people with bipolar disorder, and the authors had expected this to occur. Purine depletion has been associated with major depression in some people (Renshaw et al., 2001, cited in link), and Renshaw et al. (2001) suggested that the replenishment of adenosine by S-adenosylmethionine could be a mechanism by which S-adenosylmethionine can relieve depression [see here for Renshaw, 2001: (http://hardcorephysiologyfun.blogspot.com/2009/01/details-on-nucleotides-bioavailability.html)]. I probably don't need to say it, but, in my opinion, those facts provide more evidence of the potential for choline to worsen mood, etc. The idea that such a drastic depletion of purines would only occur in people who have been diagnosed with a psychiatric condition makes, in my opinion, no sense at all. In a similar vein, Carlezon et al. (2002) [Carlezon et al., 2002: (http://www.ncbi.nlm.nih.gov/pubmed/12022961)] found that choline produced effects, in an animal model of depression, consistent with a worsening of depression. When the researchers gave the rats cytidine-5'-diphosphocholine (a.k.a. citicoline, citicholine, CDP-choline, etc.), there was no net effect in the model of depression. Carlezon et al. (2002) found, however, that cytidine, by itself, produced an antidepressant effect. This is consistent with the articles showing antidepressant effects of uridine in animal models [one example: Carlezon et al., 2005: (http://www.ncbi.nlm.nih.gov/pubmed/15705349)]. (They might just try leaving out the omega-3 fatty acids, in my opinion, but that's not relevant to the discussion.) The article by Carlezon et al. (2002) shows, in my opinion, that the choline that's liberated from CDP-choline (it's hydrolyzed into choline and cytidine in the intestinal tract and elevates plasma uridine, but not cytidine, in humans) may not do much of anything except block the antidepressant effect of cytidine. CDP-choline continues to be talked-up in the literature, and no one seems to be aware of the fact that using cytidine by itself would, in my opinion, be a much more sensible approach to increasing phosphatidylcholine levels in the brain (in comparison to cytidine and choline together).
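For anyone wondering how S-adenosylmethionine could "replenish" adenosine, the route I have in mind is just the standard transmethylation pathway (this is textbook biochemistry, not anything taken from the Lyoo or Renshaw articles themselves):

\[
\text{SAM} + \text{methyl acceptor} \rightarrow \text{SAH} + \text{methylated product}
\]
\[
\text{SAH} + \text{H}_2\text{O} \rightleftharpoons \text{adenosine} + \text{homocysteine} \quad \text{(SAH hydrolase)}
\]

So each mole of S-adenosylmethionine that passes through a methyltransferase reaction can, after hydrolysis of the S-adenosylhomocysteine, yield a mole of adenosine, and that's the sense in which I'd interpret the "purine replenishment" idea.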
I was going to mention the potential that exists, in my opinion, for psychiatric symptoms to occur in response to high doses of branched-chain amino acid (BCAA) supplements (which are usually provided as combinations of free-form L-leucine, L-isoleucine, and L-valine), such as are used to treat catabolic states or to support athletic performance, but I don't have time to go into detail with that. BCAAs compete with tryptophan and tyrosine or phenylalanine for entry into the brain, but no one studying BCAAs seems to be aware of all the vast amounts of research showing depression or worsening of cognitive functioning in response to acute, experimental brain tyrosine depletion (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=tyrosine+depletion) or tryptophan depletion (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=tryptophan+depletion) (both experimental procedures, in which people are given amino acid mixtures that lack either tyrosine or tryptophan, can cause severe depression or worsen cognitive functioning, etc., by depleting dopamine and noradrenaline, in the case of tyrosine or phenylalanine depletion, or by depleting serotonin, in the case of tryptophan depletion). Researchers have suggested that BCAA supplementation could be used to reduce "central fatigue" during exercise (http://scholar.google.com/scholar?q=BCAA+%22central+fatigue%22&hl=en&lr=) (the contribution of a reduction or alteration in brain functioning to the perception or actual extent of muscular fatigue), and an increase in tryptophan uptake is thought to contribute to central fatigue (hence the idea of researchers that the use of BCAAs will supposedly "energize" a person by depleting tryptophan). BCAAs have also been researched for their supposed neuroprotective or cognition-enhancing effects (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=%22branched+chain+amino%22+neuroprotective+OR+amyotrophic+OR+mania), but those applications (and the use in the context of central fatigue, for example) are, in my opinion, not good ideas.
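The competition for transport is usually framed in terms of the ratio of tryptophan (or tyrosine) to the sum of the large neutral amino acids that share the carrier; this is only the standard way of sketching it, with the usual list of competing amino acids, and not a formula from any specific BCAA article:

\[
\text{brain Trp uptake} \; \propto \; \frac{[\text{Trp}]}{[\text{Trp}] + [\text{Tyr}] + [\text{Phe}] + [\text{Leu}] + [\text{Ile}] + [\text{Val}] + [\text{Met}]}
\]

A large dose of free-form BCAAs inflates the denominator without adding anything to the numerator, and the same logic applies to tyrosine and phenylalanine; that's the whole basis for the "central fatigue" rationale and, in my opinion, for the concern about depleting the precursors of serotonin and the catecholamines.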
Wednesday, February 18, 2009
Potential for Hypophosphatemia or Phosphate Depletion From Some Amino or Keto Acids
This article [Zimmermann et al., 1996: (http://www.ncbi.nlm.nih.gov/pubmed/8676818)] is really interesting, and the authors discuss and cite research showing that alpha-ketoglutarate and other keto acids (and even free-form amino acids, under some circumstances) can significantly decrease serum phosphate and parathyroid hormone (PTH) levels. I was really surprised by this, but there are dozens of articles showing the same thing. The authors of some of the older articles attributed the effect of the keto acids or amino acids, which were used with the intention of reducing either serum PTH or serum phosphate levels or both, in people with kidney disease, to the phosphate-binding effect of the calcium in calcium salts of keto acids (a.k.a. ketoacids). But it's pretty clear that it's not the calcium, primarily, because several articles show the effect can occur in response to the intake of ornithine alpha-ketoglutarate or other amino acids or keto acids (without any increase in calcium intake) [here's one of them: Combe and Aparicio, 1994: (http://www.ncbi.nlm.nih.gov/pubmed/7853797)]. I don't have time to put the rest up now, but I will shortly. In most of the articles, the authors have attributed it to the capacity of the amino acids to drive extracellular phosphate into cells, such as in association with increases in nucleotide biosynthesis and DNA replication, etc. It's similar to the hypophosphatemia that can occur in the "refeeding syndrome" (a disturbance in the homeostasis of phosphate or sodium or potassium, etc., in response to an acute increase in nutriture, basically, although that definition is imprecise: http://scholar.google.com/scholar?num=100&hl=en&lr=&q=refeeding+hypophosphatemia), and I found some other articles discussing hypophosphatemia in response to increases in the amino acid intakes of people on total parenteral nutrition (such as in some disease state that necessitates the intake of food by some other route than enterally, or orally) [Lamiell et al., 1990: (http://www.ncbi.nlm.nih.gov/pubmed/2108005)]. In that case, the authors more or less definitively established that the amino acids per se were causing hypophosphatemic encephalopathy (abnormal state of consciousness, such as confusion or extreme fatigue or drowsiness or dementia) in a man.
The message that comes out of a lot of the articles, taken together, is, in my opinion, that disease states or physical trauma or muscle wasting, etc., increases the likelihood that a person could develop hypophosphatemia or another electrolyte abnormality (or a milder depletion of serum phosphate). What's interesting and unfortunate is that these are the very disease states in which L-glutamine and arginine have been used therapeutically. The authors of this article [Maier-Dobersburger et al., 1994: (http://www.ncbi.nlm.nih.gov/pubmed/8201756)] refer to the fact that physical trauma or muscle wasting and undernutrition can increase the requirements of cells for "growth-promoting" substrates [(http://scholar.google.com/scholar?num=100&hl=en&lr=&q=catabolic+glutamine+OR+ketoglutarate); (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=wasting+glutamine+OR+ketoglutarate)], which glutamine would be an example of, and cause the normal phosphate intake to become inadequate. This increase in the requirement for glutamine or arginine or exogenous nucleotides makes the use of those compounds rational, but the message that comes from the articles on the hypophosphatemic effects that can occur, as in response to supplementation with alpha-ketoglutarate and other amino acids or keto acids, is that the amino acids could then, in my opinion, exacerbate the hypophosphatemia in some people.
In my opinion, high doses of glutamine or conceivably arginine or other amino acids could decrease the serum phosphate in people who are not suffering from obvious disease states, particularly under certain conditions. For example, Maier-Dobersburger et al. (1994) noted that low vitamin D status has been associated with hypophosphatemia, and I think low magnesium intake is associated with low serum phosphate, also. But I think the potential hypophosphatemic effects of some of these amino acids or keto acids might have something to do with an increase in the excretion of phosphate by the kidneys, such as in response to changes in the acid-base balance. There are at least two articles showing that the ingestion of only 2 grams of L-glutamine can increase serum bicarbonate [Welbourne, 1995: (http://www.ncbi.nlm.nih.gov/pubmed/7733028); Welbourne et al., 1998: (http://www.ncbi.nlm.nih.gov/pubmed/9537612) (http://www.ajcn.org/cgi/reprint/67/4/660.pdf)], and the authors suggested that the effect might have occurred in response to the growth-hormone release that glutamine had induced. Glutamine is used as a growth hormone releaser and was shown to release growth hormone in one of those articles (Welbourne, 1995), but I don't think that's the main mechanism for the increase in serum bicarbonate in response to glutamine. Welbourne et al. (1998) discuss the fact that the breakdown of glutamine, into glutamate and ultimately alpha-ketoglutarate or glucose (by glutaminase and either glutamate dehydrogenase or aspartate aminotransferase, respectively) in the kidneys, is an important mechanism whereby ammonia, derived from glutamine, can be excreted as the ammonium cation (NH4+) and thereby acidify the urine (in response to metabolic acidosis). Bichara et al. (1990) found that an increase in the urinary loss of phosphate occurred in concert with an increase in the urinary acidification, produced by ammonia excretion (as the ammonium cation), and part of this increase in the acidification of the urine was the result of an increase in serum parathyroid hormone (PTH). The PTH had increased in response to metabolic acidosis, and the extra PTH contributed to the phosphaturia (loss of phosphate in the urine). It's long been known that, all other things being equal, elevations in PTH increase phosphate excretion, but the interesting thing is the association with ammonia excretion and with acidosis per se. In "milk alkali syndrome," the high amounts of phosphate and calcium in cow's milk can lead to metabolic alkalosis (when a person drinks excessive amounts of milk, and the syndrome usually occurs in very young children, as far as I know).
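To spell out the renal steps I'm referring to (these are the standard ammoniagenesis reactions from any renal physiology text, not anything specific to the Welbourne articles):

\[
\text{Glutamine} + \text{H}_2\text{O} \xrightarrow{\text{glutaminase}} \text{Glutamate} + \text{NH}_3
\]
\[
\text{Glutamate} + \text{NAD}^+ + \text{H}_2\text{O} \xrightarrow{\text{glutamate dehydrogenase}} \alpha\text{-ketoglutarate} + \text{NADH} + \text{NH}_3
\]
\[
\text{NH}_3 + \text{H}^+ \rightarrow \text{NH}_4^+ \;\; \text{(excreted in the urine)}
\]

The further metabolism of that alpha-ketoglutarate in the proximal tubule is what's generally described as generating the "new" bicarbonate that's returned to the blood, and that sort of mechanism seems, to me, like a more plausible explanation for the glutamine-induced increases in serum bicarbonate than growth hormone release does.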
The relevance of this is that, in my opinion, the metabolism of glutamine in the kidneys, given the potential for phosphate excretion to be "coupled" to urinary acidification by glutamine-derived ammonium, seems like it would be subject to more regulatory mechanisms than the metabolism of alpha-ketoglutarate (and could, in my opinion, be somewhat less likely to produce hypophosphatemia than alpha-ketoglutarate would be). Phosphate depletion can occur during endurance events, in athletes, and this type of thing could happen in a person starting an exercise program, etc. It would be wise for a person to monitor his or her serum phosphate levels, under a doctor's supervision, when starting any of these supplements. And one should discuss, with his or her doctor, any risk factors that he or she might have for the development of electrolyte disturbances. I don't have time to go into the other relevant points, but glutamine metabolism in the kidneys is very different from glutamine metabolism in the liver and in other organs [Baverel and Lund, 1979: (http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1161843)]. Also, some articles discuss the fact that exogenous alpha-ketoglutarate would, as expected, enter the cytosol in the cells of the kidneys, and cytosolic alpha-ketoglutarate produces different regulatory effects, in comparison to glutamine itself, on glutamine metabolism and, for example, dicarboxylic acid uptake from the blood into cells in the kidneys. Some articles discuss the fact that not much alpha-ketoglutarate is likely to be converted into glutamine, even though alpha-ketoglutarate can sometimes "spare" glutamine breakdown. This fact is relevant in the context of the kidneys, in particular.
Taking phosphate supplements without a doctor's supervision, especially at anything resembling a high dose, is, in my opinion, unwise and potentially dangerous. I don't know if they're even available, because (in response to the black-box warning, by the FDA, of the risk for acute renal failure due to high-dose phosphate ingestion, to prepare for some medical procedures, in some people) some companies withdrew their over-the-counter phosphate-containing products sometime last year. The doses of phosphate associated with acute renal failure, due to acute phosphate nephropathy (http://scholar.google.com/scholar?q=acute+phosphate+nephropathy&hl=en&lr=), tended to be very large (18 grams or more, in some cases, but I only looked at a couple of articles today). In my opinion, the use of something like vitamin D supplementation or "milk intake" (apparently milk has a high level of phosphate in some unusual form, complexed with casein and calcium in colloidal form), to alter one's phosphate absorption or excretion, would be a safer approach. I mentioned some of the issues with vitamin D supplementation in past postings. Magnesium has really complex interactions with PTH and phosphate metabolism, and I can't go into those now.
But the problem with the approach of increasing one's phosphate intake would, in my opinion, be that an acute increase in phosphate intake will tend to increase PTH release, thereby exacerbating any PTH-mediated urinary phosphate loss (Combe and Aparicio, 1994, cited above, discuss that). An increase in the serum 25-hydroxyvitamin D level will decrease PTH levels [this effect, which is linear up to like 200 nM of 25(OH)D, simply cannot be due to more and more renal 1alpha,25-dihydroxyvitamin D (hormonal vitamin D) production, because the production of hormonal vitamin D plateaus above a certain 25(OH)D level] and reduce urinary phosphate loss by that mechanism, as far as I know. The metabolic effects of abnormally-high hormonal vitamin D are very different, in the absence of a granulomatous disease, than the effects of elevated 25(OH)D levels (within the range of "safe" levels). Some articles have shown serum 25(OH)D levels to be roughly-inversely associated with serum hormonal vitamin D levels, over a certain range of 25(OH)D levels (probably reflecting a reduction in the output of macrophage-derived hormonal vitamin D or something, etc.). I think low magnesium intake can decrease serum phosphate, but increasing magnesium can increase PTH responsiveness and increase PTH output, in some cases. The situation is really complicated with magnesium.
Sunday, February 15, 2009
Contribution of Glutamine to the Pool of TCA Cycle Intermediates in Neurons, Cardiac Myocytes, and Other Cell Types
This article [Hodges and Snyder, 2004: (http://www.ncbi.nlm.nih.gov/pubmed/15571244)] is really interesting and cites research showing that L-glutamine is the major energy source for cultured reticulocytes (immature red blood cells) and other cultured cell types and that glutamine + inosine (or other purines) can rescue cultured cells in media that lack glucose. In those articles, it's interesting that pyruvate or uridine or glutamate were not protective against glucose deprivation. In cultured lymphocytes or in lymphocytes in vivo (or in reticulocytes in vivo), as examples of cell types in which the activities of the de novo purine biosynthetic enzymes are very high, exogenous glutamine (i.e. adequate glutamine availability) would be important for de novo purine biosynthesis. But the main point of these articles and of all the articles showing protection against postischemic ATP and phosphocreatine depletion (the PCr depletion occurs in response to ATP depletion) in the heart and other tissues [Khogali et al., 1998: (http://www.ncbi.nlm.nih.gov/pubmed/9602431); Wischmeyer et al., 2004: (http://www.ncbi.nlm.nih.gov/pubmed/14621120); Stottrup et al., 2006: (http://www.ncbi.nlm.nih.gov/pubmed/17042921); (http://scholar.google.com/scholar?num=100&hl=en&lr=&cites=780779484133987619)] is that glutamine is being converted into tricarboxylic acid (TCA) cycle substrates (i.e. entering the TCA cycle as 2-oxoglutarate or another TCA cycle intermediate) and acting as an anaplerotic substrate and "energy source." Many articles discuss the fact that glutamine is the major energy source for cells lining the upper intestine in the fasted state (the cells take up glutamine from the blood). Many articles show that exogenous glutamine improves glycogen and glutathione replenishment following ischemia or metabolic stress (even in the context of exercise, for example), including the ones by Stottrup et al. (2006) and Wischmeyer et al. (2004), respectively.
Everyone's afraid of increasing extracellular glutamate in the brain, but no one seems to consider the fact that the provision of exogenous glutamine could spare significant amounts of ATP (glutamine synthetase is ATP-consuming, as written out below, and the consumption of ATP at single enzymatic steps begins to become very significant in the context of something like glutamine synthesis). The amounts of glutamine that are synthesized in different tissues are very large. Also, some articles make the erroneous statement that changes in the plasma glutamine concentration, which is, incidentally, drastically depleted following high-intensity exercise, occur independently of changes in the rate of efflux of glutamine or glutamate from the brain interstitial fluid (ISF). This is not true as a general statement, but it may be true under some experimental conditions. O'Kane et al. (1999) cite research showing that glutamate is exported across the luminal membranes of cerebral vascular endothelial cells in response to an intracellular glutamate concentration, in endothelial cells, that exceeds the plasma concentration of glutamate. Glutamate is known to be exported across the abluminal membrane (between the ISF and endothelial cell cytosol) into endothelial cells, glutamine is taken up into endothelial cells from the blood, and glutamate+glutamine efflux increases in response to ischemia or glucose deprivation or anoxia. That means that the plasma glutamate concentration, which is increased in response to exogenous glutamine, can determine the rate of efflux of glutamate from the CNS, at least under conditions that either increase glutamate efflux across the abluminal membranes of endothelial cells or decrease the plasma glutamate concentration (such as high-intensity exercise or muscle-wasting conditions, as in protein-energy malnutrition in chronic disease states). So under conditions of ischemia, maintaining the plasma glutamine concentration could, in my opinion, be expected to help, indirectly, to maintain the pools of TCA cycle intermediates in neurons and astrocytes (by limiting glutamine+glutamate efflux from the brain). The assumption seems to be that glutamine can only be converted into glutamate in cells or in the brain or that all of the glutamine is going to remain as extracellular glutamate. Glutamine is rapidly converted into aspartate and 2-oxoglutarate (alpha-ketoglutarate), etc., and most glutamate in the brain is intracellular. This article shows that exogenous alpha-ketoglutarate (the carbon skeleton of glutamine/glutamate and a TCA cycle intermediate) or pyruvate can be neuroprotective against PARP-activation-induced neuronal loss [Ying et al., 2002: (http://www.ncbi.nlm.nih.gov/pubmed/12142562)]. I have a lot of other articles on my computer that show the same thing in neurons (increases in or maintenance of ATP levels with combinations of exogenous glutamine or aspartate, along with purines, especially). A person would nonetheless want to talk to his or her doctor before using any supplement. But this is the type of thing that could help explain the exercise intolerance of people in chronic disease states or the neurotrophic effects that voluntary exercise can have, provided the depletions of plasma glutamine are not excessively prolonged.
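For reference, the ATP consumption I mean is just the glutamine synthetase reaction itself (the standard reaction, not something from a particular article):

\[
\text{Glutamate} + \text{NH}_4^+ + \text{ATP} \rightarrow \text{Glutamine} + \text{ADP} + \text{P}_i
\]

One ATP per molecule of glutamine doesn't sound like much until one considers that tissues synthesize and export glutamine in gram quantities every day, which is why I think the ATP-sparing effect of exogenous glutamine is worth taking seriously.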
I used to think, as others with an interest in sports nutrition have thought, that glutamate or alpha-ketoglutarate would be superior to glutamine for various purposes, given that glutamine produces more ammonia than glutamate or alpha-ketoglutarate do (alpha-ketoglutarate contains no nitrogen). In my opinion, based on various articles and arguments I've seen, this is probably not true. In one article I have, the author makes a convincing case that glutamine is superior, by virtue of its ability to "spare" glutamate that would otherwise be used for glutamine biosynthesis (an ATP-consuming process). That, besides the replenishment of TCA cycle intermediates, could account for some of the cell-energy-maintaining effects of exogenous glutamine.
The conversion of glutamine into aspartate or alpha-ketoglutarate (2-oxoglutarate) could also play a role in the supposed ATP-buffering effects of exogenous glutamine. Another "layer" to anaplerosis is that an excess of acetyl-CoA or acyl-CoA's can, by sequestering free CoA, inhibit the alpha-ketoglutarate dehydrogenase (KGDH) reaction, and true anaplerosis, under those conditions, requires that substrates enter the TCA cycle at a point that bypasses the "rate-limited" KGDH step (http://hardcorephysiologyfun.blogspot.com/2009/01/coenzyme-sequestration.html). The changes in the ratios of acetyl-CoA to free CoA or to acyl-CoA's, other than acetyl-CoA, that occur, in myocytes or other cell types, during exercise depend on the intensity and duration of the exercise. Another effect of the glutamine-induced increases in the aspartate pools is that aspartate can help maintain the purine nucleotide cycle in the skeletal muscles during exercise. That's part of the rationale, in my opinion, for thinking about purines in conjunction with glutamine, given that purines can be depleted, to varying degrees, during exercise, from the muscles (and also other tissues, such as the liver and probably, in my view, the brain). There's a "purine nucleotide cycle" for salvaging purines in the brain, too, but it's not clear that it's necessarily as closely connected to the TCA cycle as the purine nucleotide cycle in the skeletal muscle is. The assumption in some articles is that the increase in the uptake of ammonia into the brain, during exercise, is partially responsible for central fatigue. That may be true to some extent, but ammonia has also been shown to increase growth hormone release, for example. There might be some "beneficial" or trophic effects, even if a slight increase in ISF ammonia limits the duration of exercise or produces some fatigue or cell-energy stress in the short term. But my point was that, even with the extra ammonia that glutamine could conceivably provide, glutamine supplementation has been shown to reduce ammonia accumulation, under some conditions. Low doses of arginine (6 grams or less, sometimes 9) are also typically combined with glutamine, given that arginine can increase ammonia disposal in the liver and conceivably antagonize the glutamine-induced increases in plasma bicarbonate, etc.
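The purine nucleotide cycle I'm referring to, in skeletal muscle, is just the standard set of reactions (textbook material, not anything from a specific article):

\[
\text{AMP} + \text{H}_2\text{O} \xrightarrow{\text{AMP deaminase}} \text{IMP} + \text{NH}_3
\]
\[
\text{IMP} + \text{Aspartate} + \text{GTP} \rightarrow \text{Adenylosuccinate} + \text{GDP} + \text{P}_i
\]
\[
\text{Adenylosuccinate} \rightarrow \text{AMP} + \text{Fumarate}
\]

The net reaction is aspartate + GTP → fumarate + ammonia + GDP + Pi, and the fumarate is the anaplerotic output; that's the sense in which aspartate (and, upstream, glutamine) ties the cycle to the TCA cycle in exercising muscle.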
Morris (2004) [Morris, 2004: (http://jn.nutrition.org/cgi/reprint/134/10/2743S) (http://www.ncbi.nlm.nih.gov/pubmed/15465778?dopt=Abstract)] noted that arginine biosynthesis consumes 2 moles of ATP per mole of citrulline (I've written the stoichiometry out below), and this could suggest, in my opinion, that exogenous arginine would be preferable to citrulline. Exogenous glutamine can increase plasma citrulline, which is then converted into arginine in the kidneys. This is the main site at which blood-borne arginine is synthesized. It's not possible to overcome the requirement for renal arginine biosynthesis, though. When people compare arginine and citrulline, the "neutral" influence of citrulline on urea production (given that citrulline has one fewer amine group than arginine does) is viewed as being advantageous. The argument of Morris (2004), namely that arginine formation consumes ATP, suggests, in my opinion, that arginine would be a better choice. It's true that arginine tends to be taken up by the liver en masse, but the elevations in plasma arginine, in response to exogenous arginine, require at least a couple of weeks, I think, or longer, to emerge. So the short-term elevations of plasma arginine, in response to citrulline, could well be larger than those produced by arginine itself, at least initially. Even though ornithine might appear to be favorable in its capacity to "spare nitrogen," by reducing ammonia levels, and to not produce the dose-limiting side effects that arginine produces (in doses of arginine in excess of 9 g/dose or even 9 g/d), that argument (the idea that ammonia scavenging or less ureagenesis is the be-all and end-all of these types of approaches, to buffering plasma glutamine and TCA cycle intermediates in the skeletal muscles or brain or elsewhere) has carried too much sway and is not really valid, in my opinion, under normal conditions (i.e. in the absence of hyperammonemia, such as in the context of hepatic encephalopathy, induced by hepatic failure or renal failure, etc.). Additionally, ornithine can suppress creatine biosynthesis, but increases in plasma citrulline or arginine have the potential to increase creatine biosynthesis (see link below). That distinction, between arginine and ornithine, is likely, in my opinion, to be especially important in the context of in situ (it's both in situ and de novo) creatine biosynthesis in the brain [(http://hardcorephysiologyfun.blogspot.com/2009/02/glutamate-glutamine-cycle-de-novo.html); (http://hardcorephysiologyfun.blogspot.com/2009/01/pyridoxine-folate-cobalamin.html)].
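The "2 moles of ATP per mole of citrulline" figure from Morris (2004) just reflects the fact that argininosuccinate synthetase cleaves ATP to AMP and pyrophosphate, so the conversion that the kidneys have to carry out looks like this (standard reactions):

\[
\text{Citrulline} + \text{Aspartate} + \text{ATP} \rightarrow \text{Argininosuccinate} + \text{AMP} + \text{PP}_i
\]
\[
\text{Argininosuccinate} \rightarrow \text{Arginine} + \text{Fumarate}
\]

Counting the subsequent hydrolysis of the pyrophosphate, that's two ATP equivalents per mole of arginine formed, and that's the energetic cost that exogenous arginine, unlike citrulline, doesn't impose.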
Saturday, February 14, 2009
Use of Glutamine in Chemotherapy-Induced Neuropathy; GABAergic Symptoms and Glutamate Depletion in Pyridoxine-Induced Neuropathy?
The authors of this article [Wang et al., 2007: (http://theoncologist.com/cgi/content/full/12/3/312) (http://www.ncbi.nlm.nih.gov/pubmed/17405895)] used 30 grams/d of L-glutamine to help prevent chemotherapy-induced peripheral neuropathy. There's a whole series of articles on this, and it's interesting. I don't know what the mechanism would be, but glutamine might just increase the pool of tricarboxylic acid (TCA) cycle intermediates in the neurons. This reminds me of that article showing that glutamic acid could prevent peripheral neuropathy induced by an excess of vitamin B6 (pyridoxine) (http://hardcorephysiologyfun.blogspot.com/2009/01/clues-to-mechanisms-of-excessive.html), and I'm thinking that the peripheral neuropathy that occurs in response to high doses of vitamin B6 might well be metabolic. The authors [Arkaravichien et al., 2003: (http://www.ncbi.nlm.nih.gov/pubmed/12909271)] of that article, discussing the prevention of pyridoxine-induced peripheral neuropathy with glutamic acid, attributed the prevention to glutamatergic mechanisms (i.e. an increase in glutamate storage and release), and that could be part of the mechanism. The activity of glutamic acid decarboxylase (GAD, an enzyme that converts glutamate to GABA) is thought to be very sensitive to pyridoxine intake (PLP availability) and to increase more than the activity of GABA transaminase [an enzyme that converts GABA into succinate semialdehyde and is another vitamin B6-derived-cofactor (PLP)-dependent enzyme] in response to an increase in PLP availability. As the dosage of pyridoxine is increased into the "peripheral-neuropathy-inducing" range of long-term dosages (125-150 or more mg/d), one would expect to see more GABAergic effects (sedation, slowing of mental function, depression, etc.).
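Just to lay out the two PLP-dependent steps I'm talking about (these are the standard reactions, not anything specific to the Arkaravichien article):

\[
\text{Glutamate} \xrightarrow{\text{GAD (PLP)}} \text{GABA} + \text{CO}_2
\]
\[
\text{GABA} + \alpha\text{-ketoglutarate} \xrightarrow{\text{GABA transaminase (PLP)}} \text{Succinate semialdehyde} + \text{Glutamate}
\]

If extra PLP boosts the first reaction more than the second, the net effect would be a shift of the glutamate pool toward GABA, which is the basis for my expectation of more GABAergic effects at higher vitamin B6 dosages.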
PLP is a cofactor for over 100 enzymes, though. But GAD activity is known to increase more than other PLP-dependent enzymes' activities do, as the dosage of vitamin B6 is increased. The increases in GAD activity are evidently a result of the loose binding of PLP (in comparison to the binding of PLP to GABA transaminase and other PLP-dependent enzymes) to one of the isoforms of GAD that's expressed in the brain [Martin et al., 1991: (http://www.ncbi.nlm.nih.gov/pubmed/1685767)]. Increases in the transaminase enzymes that are PLP-dependent would produce really complicated effects on aspartate and glutamate [and on the activities of TCA cycle enzymes, by maintaining the flux of substrates through the enzymes whose activities sustain the malate-aspartate shuttle: (http://hardcorephysiologyfun.blogspot.com/2009/01/mechanisms-of-neuroprotection-by.html)], for example, and those changes could interact with the increases in GAD activity and conceivably lead to peripheral neuropathy. But PLP is also a cofactor for glycogen phosphorylase, and increases in the vitamin B6 intake tend to increase glycogenolysis. Vitamin B6 is used in McArdle's disease (a glycogen storage disease) to promote glycogenolysis, and that effect of vitamin B6 could be a double-edged sword and interfere with blood sugar regulation or even lead, in concert with other PLP-dependent changes, to astrocyte glycogen depletion (and produce some degree of peripheral neuropathy by increasing the frequencies of efferent discharges from primary afferent neurons, with the action potentials or inflammatory changes originating at the central terminals of the primary afferent neurons, etc.). PLP bound to glycogen phosphorylase in skeletal muscles turns over really slowly and is tightly-bound, but I think that's not necessarily relevant to the issue of peripheral neuropathy (sensitivity to pyridoxine status doesn't correlate inversely with the tightness of bound PLP to all PLP-dependent enzymes, just with GAD). Peripheral neuropathy due to an excess of pyridoxine tends to develop gradually, but I'm just saying that there's not going to be a magic, cut-off dosage that won't produce any neuropathic changes but will produce purely "therapeutic" or desirable effects. The mechanisms leading to neuropathy are, in my opinion, going to be present at a low level at lower dosages. In any case, I think that some of those GABAergic effects of high dosages of vitamin B6 should be viewed as being, in my opinion, "pre-neuropathic."
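And the glycogenolytic reaction for which PLP serves as the cofactor is glycogen phosphorylase (again, the standard reaction):

\[
\text{Glycogen}_{(n)} + \text{P}_i \xrightarrow{\text{glycogen phosphorylase (PLP)}} \text{Glycogen}_{(n-1)} + \text{Glucose-1-phosphate}
\]

In phosphorylase, the PLP works through its phosphate group rather than through the Schiff-base chemistry that GAD and the transaminases use with their substrates, which is consistent, in my view, with the idea that the tightly-bound, slowly-turning-over phosphorylase pool isn't the relevant one for the neuropathy issue.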
Sunday, February 8, 2009
Pyruvate and Ethyl Pyruvate in Neuroprotection and Other Contexts
The research on the therapeutic use of pyruvate is actually voluminous, but the articles are sort of hidden in plain sight (one has to search for "ethyl pyruvate," which is essentially a pyruvate prodrug, or for "sodium pyruvate" or "calcium pyruvate," which are the commonly-used salts of pyruvate: http://scholar.google.com/scholar?as_q=&num=100&btnG=Search+Scholar&as_epq=&as_oq=++sodium-pyruvate+calcium-pyruvate+ethyl-pyruvate&as_eq=hydrogenation+enantioselective&as_occt=any&as_sauthors=&as_publication=&as_ylo=&as_yhi=&as_allsubj=all&hl=en&lr=).
The end of that url should include an "=" sign, and I don't know if the link will work without it. But ethyl pyruvate is just the ethyl ester of pyruvate and is more stable in solution than pyruvate itself. The ethyl group is just cleaved by nonspecific esterase enzymes, as far as I understand it, evidently either in the gastrointestinal tract or in cells in the target tissues. A lot of the research is being done on the uses of ethyl pyruvate in acute inflammatory states, apparently, or situations of acute trauma to organs. But that doesn't mean pyruvate is just an "anti-inflammatory" drug, etc. Here are some articles evaluating the neuroprotective effects of pyruvate (from ethyl pyruvate):
(http://scholar.google.com/scholar?num=100&hl=en&q=ischemia+sodium-pyruvate+OR+calcium-pyruvate+OR+ethyl-pyruvate&spell=1). In my opinion, something like pyruvate, given as ethyl pyruvate, could be used at higher dosages without overloading the body with calcium or sodium, as Brunengraber and Roe (2006) (http://hardcorephysiologyfun.blogspot.com/2009/02/anaplerotic-compounds-free-form-glycine.html) noted.
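Since the esterase step keeps coming up: all that cleavage amounts to, as far as I understand it, is the hydrolysis of the ethyl ester (a generic esterase reaction, not a pathway from any particular article):

\[
\text{Ethyl pyruvate} + \text{H}_2\text{O} \xrightarrow{\text{nonspecific esterases}} \text{Pyruvate} + \text{Ethanol}
\]

That's the sense in which ethyl pyruvate functions as a pyruvate prodrug.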
One thing that is worth noting is that pyruvate, from calcium pyruvate, for example, can degrade to parapyruvate in aqueous solution. The authors of some of the articles about ethyl pyruvate mention that, and parapyruvate can evidently cause problems. One would probably not, in my opinion, want to "store" pyruvate in solution for a long time or make some sports drink with pyruvate in it. I say that because some serious athletes do that, and some coaches or other people suggest making dilute solutions of sugar in water and drinking them during exercise (like home-made sports drinks with low concentrations of sugars). But I looked at the articles showing the lability of pyruvate, and they're almost unintelligible. Pyruvate clearly is unstable in solution, but I don't know how long it would take to degrade. Those old articles, from the 1960s, don't make the time frame clear to me, but the use of sodium pyruvate in mitochondrial disorders [Tanaka et al., 2007: (http://www.orifund.org/doc/treatment.pdf) (http://www.ncbi.nlm.nih.gov/pubmed/17881297)] would suggest, in my opinion, that pyruvate is stable enough to survive the few minutes, in solution, before its absorption. My guess is that the degradation would only become significant after many minutes to hours, but that's just my guess. Haas et al. (2007) did note that testing the blood for pyruvate requires the sample to be treated with 8 percent perchlorate, put in an ice bath, and then tested "rapidly" for its pyruvate concentration [Haas et al., 2007: (http://pediatrics.aappublications.org/cgi/reprint/120/6/1326.pdf) (http://www.ncbi.nlm.nih.gov/pubmed/18055683)].
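Just to make the time-frame question concrete, and purely as an illustration (the half-life below is a hypothetical placeholder, not a measured value from those old articles), if the degradation were roughly first-order, the fraction of intact pyruvate remaining after a time t would be

\[
\text{fraction remaining} = \left(\tfrac{1}{2}\right)^{t/t_{1/2}},
\]

so, with a hypothetical half-life of 12 hours, a solution consumed within 10 minutes would still contain about \((1/2)^{10/720} \approx 0.99\), or roughly 99 percent, of its pyruvate, whereas a drink left sitting for many hours would have lost a substantial fraction. Again, the 12-hour figure is just there to illustrate the arithmetic, not a claim about the actual degradation rate.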
That article by Tanaka et al. (2007) is actually really good. It's a short communication but discusses some of the mechanisms, such as an increase in the NAD+/NADH ratio in the liver (presumably intramitochondrially), in response to exogenous pyruvate. The authors also note that surprisingly low doses of pyruvate have benefited some people with mitochondrial disorders, and the authors are evidently planning a multicenter trial to test sodium pyruvate in the treatment of people who have mitochondrial disorders.
Even though pyruvate looks useful, there could be a theoretical concern, in my opinion, with elevation of plasma or intracellular alanine levels, in the sense that exogenous alanine was found to decrease creatine biosynthesis in humans [Crim et al., 1976: (http://jn.nutrition.org/cgi/reprint/106/3/371.pdf)]. Haas et al. (2007) noted that elevations in plasma alanine can be an indication that pyruvate levels have been elevated for a prolonged period of time [even when an elevated lactate/pyruvate ratio accompanies an elevation in the absolute lactate and pyruvate levels (i.e. both the absolute concentrations and the [lactate]/[pyruvate] ratio are elevated over controls)]. Pyruvate can be transaminated to alanine, as far as I know, but I don't know whether alanine would actually accumulate under the conditions in which pyruvate would be administered therapeutically. Fritsche et al. (1999) [Fritsche et al., 1999: (http://www.jbc.org/cgi/reprint/274/5/3026) (http://www.ncbi.nlm.nih.gov/pubmed/9915841?dopt=Abstract)] showed that L-alanine can inhibit arginine:glycine amidinotransferase (the first enzyme of creatine biosynthesis), but I doubt that effect would occur in the short term, and there would probably be some way of preventing it. I'm not sure that alanine would even become elevated in people taking pyruvate, but, in my opinion, it's conceivable in response to very large doses of pyruvate or pyruvate prodrugs. It's the only theoretical concern I can think of.
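To spell out the bracketed point about the lactate and pyruvate levels, here is a rough, purely illustrative set of numbers (they are not taken from Haas et al., 2007):

\[
\frac{[\text{lactate}]}{[\text{pyruvate}]} \approx \frac{1.0\ \text{mM}}{0.07\ \text{mM}} \approx 14 \quad (\text{baseline}), \qquad \frac{[\text{lactate}]}{[\text{pyruvate}]} \approx \frac{4.0\ \text{mM}}{0.15\ \text{mM}} \approx 27 \quad (\text{both levels and the ratio elevated}).
\]

The second case is the pattern I was describing: the absolute lactate and pyruvate concentrations and the lactate/pyruvate ratio are all higher than in controls.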
Saturday, February 7, 2009
Brief Note on Purines
I see that some oral purine formulations are, in fact, not enteric-coated (in reference to this, for example: http://hardcorephysiologyfun.blogspot.com/2009/01/details-on-nucleotides-bioavailability.html), and here are some examples [(http://www.google.com/products?q=adenosine+triphosphate+disodium+dissolve+OR+melt&hl=en&show=dd)]. That search didn't turn up all of the suppliers of the individual products, and doing a search on an individual product gives a wider range of suppliers/vendors for the specific product. Another product contains guanosine, also, and I referred to a link, in relation to that, in my last posting. I have no financial interest in any company or in any of this type of thing. Don't shoot the messenger. In the absence of some sort of systematic analysis or attention, these types of things have the potential to remain locked-away in obscurity or shrouded in confusion and mystery for years upon years.
Oral purines have, in my opinion, the potential to lower the peripheral vascular resistance (produce vasodilation, essentially) and conceivably decrease platelet aggregation, and there could be some extra need for caution in people who have low blood pressure or blood vessel diseases or who are taking antihypertensive medications for high blood pressure (or in people who are taking drugs to decrease platelet aggregation, such as aspirin, for example). A person should always check with his or her doctor before taking any supplement.
The Glutamate-Glutamine Cycle, De Novo Creatine Biosynthesis, and Nucleotides in Relation to Energy Metabolism in the Brain
This article [Brunengraber and Roe, 2006: (http://www.ncbi.nlm.nih.gov/pubmed/16763895)] discusses the anaplerotic effects of pyruvate and of odd-chain fatty acids or related compounds (such as propionyl-L-carnitine, which is a "precursor" of propionyl-CoA, an odd-chain acyl-CoA), but the authors also mention that glutamine or glutamate can be anaplerotic and protect against the damaging effects of postischemic reperfusion (the restoration of blood flow that follows a reduction in blood flow) on the heart or other tissues. Researchers have done a lot of research on the glutamate-glutamine cycle in the brain, and it's known to be really important for cellular energy metabolism [Newsholme et al., 2003: (http://www.erin.utoronto.ca/~w3bio452/452%20Papers%202004/GlutamateRev%20CellBiochemFunc03.pdf) (http://www.ncbi.nlm.nih.gov/pubmed/12579515)]. I think the research on glutamine metabolism in the brain, in relation to tricarboxylic acid (TCA) cycle activity and glucose utilization in general, hasn't really sunk in. The vast majority of research focuses on the damaging or neurophysiological effects of extracellular glutamate, which is obviously of central importance to neurotransmission in the brain and to most aspects of brain functioning. But there seems to be an assumption that, in the context of disease states that produce cognitive impairment or psychiatric symptoms, the cellular energy-related side of the glutamate-glutamine cycle is either undisturbed or is disturbed only in ways relevant to glutamatergic neurotransmission per se. In my opinion, this is not a valid assumption. I think there's a tendency to think in terms of either-ors, to assume that the absence of the dramatic cerebral edema produced by hyperammonemia is an indication that the glutamine-glutamate cycle is functioning with pristine efficiency. I've tended to think that way, but, in view of things I've read, I'm starting to doubt that that's the case. There's a vast amount of research on the use of glutamine or alpha-ketoglutarate, usually in conjunction with arginine (or ornithine), to support protein synthesis in the context of catabolic disease states, such as cachexia due to cancer or viral illnesses or burn injuries. There are too many articles to cite on that topic, but I think there could be some potential for manipulating the glutamine-glutamate cycle in the brain in the context of some disease states. That's just my opinion, but there's research using nucleotides in combination with arginine and glutamine or alpha-ketoglutarate to support recovery from cellular injuries in just about every organ system or tissue other than the brain. I'm not sure why this would be any different for the brain, and glutamine and alpha-ketoglutarate have not been shown to produce effects similar to those of monosodium glutamate. That's just my opinion, however, and there would obviously be potential for harm in people with specific brain diseases or liver or kidney diseases. A person should obviously talk with his or her doctor before taking any supplement.
There's a lot of research showing that arginine depletion produces fatty liver by, in part, disrupting pyrimidine biosynthesis and causing orotate accumulation, and the relevance of this goes beyond issues related to liver disease. I don't have time to go into this in depth now, but, in my opinion, something like arginine alpha-ketoglutarate could be useful in, for example, helping a person begin an exercise program and get it off the ground. I'm not going to refer to any specific disease states, but, in any case, physical exercise puts demands not only on muscle cell energy metabolism but also on brain energy metabolism [Dalsgaard et al., 2002: (http://jp.physoc.org/cgi/content/full/540/2/681) (http://www.ncbi.nlm.nih.gov/pubmed/11956354?dopt=Abstract)]. Exercise significantly increases the transport of lactate into the brain, and even the "intent to exercise," to perform intentional motor movements, influences brain cell energy metabolism (Dalsgaard et al., 2002). This could be related to the anticipatory increase in the firing rates of noradrenergic neurons in the locus ceruleus (LC), for example, in much the same way as thinking about stressful situations can activate the LC-mediated, noradrenergic "stress response" system. There's a large amount of research showing this type of metabolic stress imposed by exercise on the brain, but, for some reason, there seems to be resistance to the idea that these effects are relevant under normal circumstances. I mean that researchers seem to keep wanting, in my opinion, to say that there's no reason for people to be alarmed and that people will always get beneficial effects from exercise. But the fact remains that many people do not sustain even basic exercise programs, and one reason for this, in my opinion, is that these brain-related mechanisms come into play.
I was going to put this general information up about arginine vs. ornithine. A lot of the authors of articles on ornithine alpha-ketoglutarate do not seem to realize the potential for detrimental effects that, in my opinion, can result from mass-action effects of exogenous ornithine on arginine:glycine amidinotransferase (AGAT), the first enzyme in creatine biosynthesis. This enzyme catalyzes the reversible conversion of arginine and glycine into guanidinoacetate and ornithine. Exogenous arginine+glycine have been shown to increase creatine biosynthesis in humans [Crim et al., 1976: (http://jn.nutrition.org/cgi/reprint/106/3/371.pdf)], and the combination of exogenous ornithine, limitation of dietary arginine, and exogenous creatine can suppress AGAT activity in people with guanidinoacetate N-methyltransferase deficiency (a genetic disorder) [Stromberger et al., 2003: (http://www.ncbi.nlm.nih.gov/pubmed/12889668)]. The key point is that an increase in the ratio of dietary ornithine to arginine can decrease guanidinoacetate formation and thereby reduce creatine biosynthesis. Taking exogenous glycine is not, in my opinion, a good idea, but Arias et al. (2004) [Arias et al., 2004: (http://www.ncbi.nlm.nih.gov/pubmed/15234335)] noted that excessive ornithine accumulation, in people with impairments in the activities of urea cycle enzymes, can suppress creatine biosynthesis by producing a reversal of AGAT activity. Arias et al. (2004) also noted that exogenous arginine can enter the brain much more easily than creatine can, and creatine is known, as Arias et al. (2004) discussed, to have neuroprotective and even trophic effects on the brain. If AGAT activity were suppressed outside the brain by ornithine excesses, the contribution of peripheral guanidinoacetate to the brain's creatine pool could, in my opinion, be compromised. More importantly, an excessive accumulation of ornithine in the blood could, in my opinion, enter the brain and suppress de novo guanidinoacetate production by AGAT activity that occurs within the brain (the brain makes much of its own creatine de novo).
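To make the mass-action argument explicit, the AGAT reaction is

\[
\text{arginine} + \text{glycine} \;\rightleftharpoons\; \text{ornithine} + \text{guanidinoacetate},
\]

so raising the ornithine-to-arginine ratio pushes the equilibrium back toward arginine and glycine (i.e., toward the reversal that Arias et al. (2004) described), and guanidinoacetate output, and therefore creatine biosynthesis, falls.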
As I've said in past postings, arginine, in my opinion, is safer than ornithine, and I don't personally think exogenous creatine supplementation is a good idea. Arginine and nucleotides could conceivably, along with reduced folates and vitamin B12 (which could be expected to disinhibit GAMT activity and increase de novo creatine biosynthesis in the brain, by increasing the ratio of SAM-e to S-adenosylhomocysteine, while simultaneously limiting the accumulation of neurotoxic guanidinoacetate), help to maintain creatine levels in the brain and would not, in my opinion, pose the same kinds of risks that exogenous creatine could conceivably pose under certain circumstances. Exogenous nucleotides have been shown to help maintain the phosphocreatine to inorganic phosphate ratio during ischemic conditions [Iwasa et al., 2000: (http://www.ncbi.nlm.nih.gov/pubmed/10906568)], and Silveri et al. (2003) [Silveri et al., 2003: (http://www.ncbi.nlm.nih.gov/pubmed/14550683)] found that oral SAM-e increased phosphocreatine levels and decreased beta-nucleoside triphosphate levels in the brains of people. One interpretation of that article is that SAM-e increased the overall pool of purines, and in particular adenosine nucleotides, but did not influence the overall adenylate charge (as the authors implied), and that the elevation in phosphocreatine levels, in my opinion, could be due to an elevation, by the SAM-e, of the overall ADP+AMP+ATP pool, in spite of an apparent decrease in the absolute nucleoside triphosphate levels. I say this because Renshaw et al. (2001) [Renshaw et al., 2001: (http://ajp.psychiatryonline.org/cgi/content/full/158/12/2048) (http://www.ncbi.nlm.nih.gov/pubmed/11729024)] found decreases in the levels of purines in some people with depression and proposed that the antidepressant effects of SAM-e may be due to the elevation in adenosine (and, in my opinion, the overall purine pool) derived from S-adenosylhomocysteine hydrolase activity. I discussed this in relation to the likelihood, in my opinion, that exogenous nucleotides have superior bioavailability to SAM-e (http://hardcorephysiologyfun.blogspot.com/2009/01/details-on-nucleotides-bioavailability.html). Exogenous creatine has been researched in the treatment of unipolar, treatment-resistant depression [Roitman et al., 2007: (http://www.ncbi.nlm.nih.gov/pubmed/17988366)], but, in my opinion, exogenous creatine poses risks on a number of levels. That said, that type of approach, involving the use of arginine alpha-ketoglutarate, nucleotides, and strategies aimed at disinhibiting GAMT activity, could, in my opinion, have some potential to maintain creatine levels in the brain and be applicable to conditions characterized by disruptions in brain cell energy metabolism. But that's just my opinion.
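As a footnote to the Silveri et al. (2003) interpretation above, the adenylate energy charge I'm referring to is the standard Atkinson ratio,

\[
\text{energy charge} = \frac{[\text{ATP}] + \tfrac{1}{2}[\text{ADP}]}{[\text{ATP}] + [\text{ADP}] + [\text{AMP}]},
\]

and the point is simply that the total adenine nucleotide pool (the denominator) and the charge (the ratio) are distinct quantities, which is the distinction that interpretation depends on.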
Friday, February 6, 2009
Ammonia and Cerebral Energy Metabolism
This is an interesting article on hyperammonemia and the dysregulation of the brain's "glutamate-glutamine" cycle [Vaquero and Butterworth, 2006: (http://www.ncbi.nlm.nih.gov/pubmed/16771837)]. The article focuses on hepatic encephalopathy and hepatic failure (liver failure), but there's reason to think this type of problem with cell energy metabolism, produced either by elevations in ammonia or by other changes, could occur in other disease states. One key point the authors mention is that ammonia inhibits the alpha-ketoglutarate dehydrogenase complex, a multienzyme complex that depends on a thiamine-derived cofactor and plays a major role in the tricarboxylic acid (TCA) cycle. The complex can be inhibited not only by ammonia, as discussed in the article cited above, but also by the many factors that can conspire to produce sequestration of free coenzyme A, such as an increase in the oxidation of ketone bodies [Russell et al., 1995: (http://ajpheart.physiology.org/cgi/content/abstract/268/1/H441) (http://www.ncbi.nlm.nih.gov/pubmed/7840294?dopt=Abstract)]. Even though I mentioned one article comparing the neuroprotective effects of beta-hydroxybutyrate (a ketone) to those of propionyl-L-carnitine [Puchowicz et al., 2008, cited here: (http://hardcorephysiologyfun.blogspot.com/2009/01/propionyl-l-carnitine-beta.html)], the article by Russell et al. (1995) shows that propionylcarnitine can prevent the kind of inhibition of TCA cycle flux that coenzyme A sequestration, resulting from excessive accumulation of acetyl-CoA or acetoacetyl-CoA, produces.
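The reason free coenzyme A matters at this step is that the alpha-ketoglutarate dehydrogenase complex consumes free CoA-SH as a substrate:

\[
\alpha\text{-ketoglutarate} + \text{CoA-SH} + \text{NAD}^+ \;\longrightarrow\; \text{succinyl-CoA} + \text{CO}_2 + \text{NADH},
\]

so anything that ties up the CoA pool as acetyl-CoA or acetoacetyl-CoA (heavy ketone-body oxidation, for example) can limit flux through the complex, on top of the direct inhibition by ammonia that the authors discuss.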
There's quite a bit of research showing that carnitine esters can sometimes act as "indirect ammonia scavengers" (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=carnitine+ammonia+CoA), by disinhibiting the urea cycle enzymes that can be inhibited by CoA thioesters (acyl-CoAs), etc. I don't know how potent they are in that regard, though. In my opinion, propionyl-L-carnitine and acetyl-L-carnitine are generally more useful for whatever purposes one is applying them to, and they tend to elevate the overall carnitine pool (as L-carnitine does). Alpha-keto acids [or just keto acids or ketoacids: (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=renal+%22keto+acids%22+OR+ketoacids)] have been used in people with renal failure, at least in part to act as ammonia scavengers, as I understand it. The preparations tend to use branched-chain ketoacids, such as ketoisocaproate and ketoisovalerate, and another relevant keto acid is alpha-ketoglutarate (2-oxoglutarate), the carbon skeleton corresponding to glutamate and, by extension, to glutamine. People involved in sports nutrition have tended to say that alpha-ketoglutarate, in something like arginine alpha-ketoglutarate, is more useful than glutamine because of the capacity of alpha-ketoglutarate to "scavenge" ammonia, or at least not add to the ammonia burden. I don't have time to elaborate, and these topics are complex.
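Briefly, though, the chemistry behind the "scavenging" idea is that alpha-ketoglutarate can pick up one ammonium ion via glutamate dehydrogenase and a second via glutamine synthetase:

\[
\alpha\text{-ketoglutarate} + \text{NH}_4^+ + \text{NAD(P)H} + \text{H}^+ \;\rightleftharpoons\; \text{glutamate} + \text{NAD(P)}^+ + \text{H}_2\text{O}
\]
\[
\text{glutamate} + \text{NH}_4^+ + \text{ATP} \;\longrightarrow\; \text{glutamine} + \text{ADP} + \text{P}_i + \text{H}^+,
\]

so each molecule of alpha-ketoglutarate can, in principle, fix up to two ammonium ions on its way to glutamine, whereas exogenous glutamine, once deamidated and deaminated, releases ammonia rather than consuming it. Whether that translates into a meaningful difference in the overall ammonia burden in vivo is a separate question.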
Wednesday, February 4, 2009
Absorption of Vitamin B12 by Passive Diffusion
This [Berlin et al., 1968: (http://www.ncbi.nlm.nih.gov/pubmed/5751528?dopt=Abstract)] is the original article, as far as I know, showing that about 1.22 percent (the authors also refer to a figure of 1.5 percent, from earlier research) of a dose of oral cyanocobalamin (a form of vitamin B12) is absorbed, apparently by passive diffusion, in an intrinsic-factor-independent manner. This 1.22 percent absorption occurs across a dosage range from 100 ug (0.1 mg) to 100,000 ug (100 mg), and the authors looked at the 24-hour urinary excretion of radiolabeled cyanocobalamin. I obviously don't think that taking doses that high would be wise, and I also don't think that taking high doses of cyanocobalamin is a good idea [because of the added cyanide burden from the cyanide that's liberated from part of a dose of cyanocobalamin, upon its intracellular metabolism (assuming it's not excreted in the urine first)]. In my opinion, cyanocobalamin is an inferior form of vitamin B12, and there's considerable evidence that its transport and retention and bioavailability are inferior to the transport, retention, and bioavailabilities of methylcobalamin or hydroxocobalamin (http://hardcorephysiologyfun.blogspot.com/2009/01/km-values-for-adocbl-binding-to-mmm-and.html).
Assuming the 1.22 percent absorption figure is valid, an oral dose of about 17,564 ug/d (17.5 mg/d) of methylcobalamin would be required to very loosely mimic the effect of three-times-weekly intramuscular injections of 500 ug methylcobalamin (used in a lot of studies) (1,500 ug/7 days = 214.3 ug/d; 214.3/0.0122 ≈ 17,564). I doubt that one can really compare those two dosage forms, though, because the bioavailability of intramuscular or parenteral methylcobalamin would presumably be higher than the bioavailability of oral methylcobalamin. That seems like an excessively high dose to me, and I don't know what doses are best. The main concern I would personally have would be with the high levels of free cobalamins that exist in the kidneys, evidently in endosomes or lysosomes. But the reabsorption is evidently saturable, and I've never seen any reports of nephrotoxicity (and many of the articles using the highest doses of i.m. methylcobalamin have administered it to people with renal failure). A lot of articles have shown nephrotoxicity from methylmalonic acid, though, and vitamin B12 decreases methylmalonic acid levels. Anyway, one would obviously want to check with his or her doctor before taking any supplement.
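To spell out the arithmetic in the parentheses above (the assumptions being 1.22 percent oral absorption by passive diffusion, per Berlin et al. (1968), and treating the injected dose as fully available, which is a simplification on my part):

\[
\frac{3 \times 500\ \mu\text{g}}{7\ \text{days}} \approx 214.3\ \mu\text{g/day}, \qquad \frac{214.3\ \mu\text{g/day}}{0.0122} \approx 17{,}564\ \mu\text{g/day} \approx 17.5\ \text{mg/day}.
\]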
Also, I was going to mention that the sublingual dosage form is, in my opinion (and in the opinion of various authors in the literature), unnecessary. I don't think methylcobalamin is sold in anything other than sublingual (lozenge) dosage forms, however. If I'm remembering correctly, the transcobalamin I protein (also called haptocorrin) is present in saliva, and the idea is that methylcobalamin will bind to it and somehow be absorbed to a greater extent. But, in my opinion, this doesn't make sense, given that the absorption of dosages in excess of about 2.2 ug occurs by passive diffusion and is not dependent on protein binding. One could argue that the binding of methylcobalamin to salivary haptocorrin(s) could in some way prevent its inactivation by sulfite or nitrites in the intestinal luminal fluid, but I've never heard anyone suggest that. I remember an exchange in the literature in which some authors were explaining why (in their opinions) the sublingual dosage form wasn't necessary, and the people making the counter-response just didn't seem to understand; they said things such as "there's no reason to think the people swallowed the vitamin B12 immediately," or something like that. I'm not quoting anyone exactly, but, in my opinion, the sublingual form isn't necessary.
Availability of Reduced Folates; Opinion on Intracellular Total Folates and Cancer Epidemiology and Cellular Biology
I saw that tablets of levoleucovorin, which is (6S)-5-formyltetrahydrofolate, are probably going to become available (http://www.reuters.com/article/pressRelease/idUS106792+05-Jun-2008+BW20080605), and that's the type of thing that could conceivably be an alternative to (6S)-5-methyltetrahydrofolate (L-methylfolate). L-methylfolate is available by prescription, but I'm not sure what the status is with coverage by insurance carriers. L-methylfolate is also available over the counter under the "trade" name Metafolin. Again, I have no financial interest whatsoever in any of these products, but these types of details have the potential to become "locked away," etc. I don't know whether insurance carriers would all immediately cover levoleucovorin, but it seems to me that the coverage might be more durable. Obviously, a person's doctor would have to prescribe it for an off-label use, in this case, but levoleucovorin is probably going to be approved for use in the treatment of methotrexate toxicity. I'm not saying the uses for L-methylfolate are any less serious than the treatment of methotrexate toxicity, but not everyone would agree with me. Whichever way one wants to view it, the coverage of levoleucovorin might be "durable."
The significance of the potential approval of levoleucovorin is that L-leucovorin (a.k.a. L-folinic acid) is the natural diastereomer of 5-formyltetrahydrofolate, and the natural diastereomer has been shown to produce more potent effects than the mixture of natural and unnatural diastereomers does. In my opinion, L-leucovorin would be expected to have some safety-related advantages over racemic leucovorin (such as in people who have immunological disorders or who are predisposed to allergic reactions), even though many people have obviously taken racemic leucovorin safely. The leucovorin that is available now is half (6R)-leucovorin and half (6S)-leucovorin [it's "racemic leucovorin" and is also called D,L-leucovorin or D,L-folinic acid or (6R,6S)-leucovorin or just "leucovorin"]. But the point is that even that racemic leucovorin has been used in most of the applications that L-methylfolate has been used in. In my opinion, the key point is that both L-methylfolate and L-leucovorin are reduced folates (meaning that the hydride transfer reactions, etc., have already been "taken care of" and don't need to be performed in the body), and this is thought to allow more of a dose of a reduced folate, in comparison to a dose of folic acid, to enter the brain. Reduced folates also tend to have higher bioavailabilities than folic acid does, in my opinion, although researchers keep coming out with articles that compare tiny dosages of folic acid to tiny dosages of reduced folates and sometimes show equivalent bioavailabilities. These articles seem only to confuse people and perpetuate the problems with inadequate dosages, the neglect of the importance of cobalamin (vitamin B12) repletion, etc.
I'm just going to say this and, if anyone's reading this blog, hopefully communicate my subjective sense of things. I don't have time to go through a lot of citations, and the research on folates (and forms of vitamin B12) and cancer prevention seems to still be stuck, somewhat, in problems with dosages and bioavailability, related to the use of folic acid (as only one factor explaining the problems), and so on. But the DNA damage that can result from depletion of intracellular folates tends to be very severe, and there's a lack of appreciation for how severe it is. More importantly, the uracil misincorporation into DNA that results from intracellular depletion of folates is only the "first layer" of damage. Uracil misincorporation can lead to double-strand breaks of DNA, and double-strand breaks can lead to large deletions. In my opinion, maximizing the folate cycle in a systematic and, arguably, pharmacologically aggressive manner is likely to be of significant importance for strategies aimed at cancer prevention. In a person who has previously had cancer or who has major risk factors for certain cancers, however, something like a reduced folate could, in my opinion, conceivably increase the risk of cancer recurrence or produce other, undesirable effects. A person should always check with his or her doctor before taking any nutritional supplements, given that a person's individual medical history can drastically change the effects that folates or other supplements might have. Reduced folates could increase angiogenesis, or new blood vessel growth, around a nascent tumor or could, in my opinion, in a manner that would depend on the rates of cell proliferation, the intracellular folate levels, and the cellular context, increase tumor growth. I know some would argue that I'm "getting ahead of myself" by expressing my subjective opinion about the role that the thymidylate cycle or DNA methylation or histone methylation may play in the development of some cancers, but here's a crude search on "breast cancer" and "prevent" and (folate OR folic OR tetrahydrofolate) that gives 15,700 results (http://scholar.google.com/scholar?num=100&hl=en&lr=&q=%22breast+cancer%22+prevent+folate+OR+folic+OR+tetrahydrofolate). I just do that to give anyone reading this a sense of things. I could go through all the mechanisms and important articles and the flaws in all the studies using microgram dosages of folic acid that never have a hope of elevating the levels of intracellular total folates in mammary epithelial cells, colonocytes, or other cell types. But I've discussed the problems with all these aspects of the pharmacology of folates in many other postings.
I remember reading an article that discussed the likelihood that different factors, such as environmental toxins or nutrient depletion, might contribute to the development of cancers in people. Avoidable radiation exposure (avoidable exposure to sources of *radiation*) could, in my opinion, be one factor that becomes important for some people (I'm not talking about solar radiation in this case, but I guess everyone knows that that is important to be aware of, too, and doesn't need to hear me repeat the standard refrain on that). But, at least in terms of the comparison of environmental toxins to the depletion of things like intracellular folates (and the article didn't even refer to folates specifically, as I recall), the authors concluded that environmental toxins were unlikely, in many people, to accumulate to an extent significant enough to produce major DNA damage or other changes that could result in cancers. An environmental toxin has to really build up and be ingested for a long time to produce cancers. I'm not discounting the damaging effects of airborne pollution or methylmercury in the food supply, etc. I'm just saying that, in my opinion, many of these mechanistic explanations for the ways in which environmental toxins would produce cancers are not plausible, in the sense that the chemicals either are not concentrated enough or do not exert consistent enough effects on the cells in which they do become concentrated. But it is well-known, in my opinion, that intracellular folate depletion can lead to significant DNA damage and potentially predispose to the development of cancers by altering DNA methylation and histone methylation, producing uracil misincorporation and secondary, more severe forms of DNA damage, etc. It is clear to me that there are still major problems with interventions that use minuscule dosages of folic acid or give inadequate attention to bioavailability issues. I'm not making any kinds of claims or guarantees, and all I can offer is my opinion on this type of thing. Obviously, a person should talk with his or her doctor before taking any supplements and should make sure that his or her past medical history is not going to make the intake of reduced folates or methylcobalamin unsafe, etc. Obviously, even the most systematic and pharmacologically sound approach should not be *expected* or assumed to have the capacity to prevent any disease or cancer or anything else, and the factors leading to the development of cancers are very complex.