Fundamental Toxicological Sciences

2024 - Vol. 11 No. 5

Original Article
Relationship between the dose of intravenous self-administration and the minimum effective dose for gross behavioral effects in rhesus monkeys: opiates vs CNS depressants Vol.11, No.5, p.259-266
Kenshi Nakagawa , Atsushi Fujiwara , Masahiko Iino , Mikio Sasaki , Takahiro Ootsuka , Shin-ich Sato , Takayuki Anzai , Takaaki Matsuyama
Released: October 29, 2024

In the development of drugs that affect the central nervous system (CNS), it is important to determine whether they have dependence potential and, if so, to establish an effective dose within a range that does not produce dependence. Methods for evaluating drug dependence include self-administration tests using experimental animals, while assessment of the minimum effective dose (MED) involves gross behavioral observation. This study aims to examine the relationship between the most frequently self-administered dose (peak self-administered dose, PSAD) and the MED among different types of CNS depressants in order to optimize dose range selection for the evaluation of drug reinforcing effects. PSAD was investigated by intravenous self-administration in rhesus monkeys, conducted as daily 2-hr sessions under a fixed ratio 5 schedule with a 1-min time-out after each administration. MED was investigated by gross behavioral observation following cumulative dosing. For opiates, the PSAD was 0.016 mg/kg/infusion for morphine, 0.06 mg/kg/infusion for codeine, 0.25 mg/kg/infusion for butorphanol and 0.063 mg/kg/infusion for pentazocine. For anesthetics and sedatives, the PSAD was 0.5 mg/kg/infusion for pentobarbital, 0.25 mg/kg/infusion for thiopental, 0.06 mg/kg/infusion for ketamine and 0.063 mg/kg/infusion for midazolam. The PSAD/MED ratio was 1/63-1/32 for opiates and 1/8-1/2 for anesthetics and sedatives. While previous research by Fujiwara et al. (2016) suggested that a dose range lower than the MED for gross behavioral effects should be used for intravenous self-administration in the evaluation of drug reinforcing effects, this study further indicates that the optimal dose range may vary depending on drug type.

Data Report
Estimation of daily cadmium intake in the United States through food analysis in 1980 Vol.11, No.5, p.251-257
Katsue Suzuki-Inoue , Seiichiro Himeno , Shosuke Suzuki
Released: October 23, 2024

Cadmium is an environmental contaminant that accumulates in the human kidney. Non-rice grains, vegetables, and potatoes are major sources of cadmium in the U.S. However, the daily cadmium intake levels reported in studies conducted in the U.S. vary widely, ranging from 4.63 to 51.3 µg/day/person. Most studies in the U.S. have not measured the actual cadmium concentrations in the collected foods but instead have relied on the database of average cadmium concentrations provided by the U.S. Department of Agriculture (USDA). In the present study, food and beverages were collected extensively in the U.S. from 1979 to 1980, and the actual cadmium concentrations were determined. Individual food intake data were obtained from a total diet study conducted in 1980 by the USDA. Consistent with previous reports, vegetables, grains, and potatoes were the primary sources of dietary cadmium intake. The estimated total cadmium intake was approximately 15 µg/day/person. We also found a 17-fold difference in cadmium concentrations in carrots between production areas, which was greater than the variation reported for carrots in Japan. The factors affecting the variation in cadmium concentrations in vegetables, including carrots, need to be clarified. As the relative importance of vegetables as a source of cadmium intake is increasing in Japan due to decreased rice consumption, more attention should be paid to variations in cadmium concentrations in vegetables in Japan.

Original Article
Puberulic acid displays remarkable cytotoxicity and strong inhibitory effect on the all-trans retinoic acid-induced superoxide-generating ability in U937 cells Vol.11, No.5, p.243-249
Hidehiko Kikuchi , Kaori Harata , Takefumi Sagara , Harishkumar Madhyastha , Hitomi Mimuro , Futoshi Kuribayashi
Released: October 18, 2024

In March 2024, a major health outbreak involving renal impairment occurred in Japan, caused by the ingestion of food supplements made from beni-koji (red yeast rice). Puberulic acid (PA), a troponoid compound produced by contaminating Penicillium spp., is attracting attention as a causative agent of this health disaster. Regarding toxicity, PA was reported to show weak cytotoxicity against fetal human lung fibroblast-like MRC-5 cells, with an IC50 value of about 289 μM (Iwatsuki et al., 2011). However, the physiological effects of PA on human tissues and cells remain poorly understood. Therefore, in this study, we investigated the effect of PA on the viability and the all-trans retinoic acid (ATRA)-induced superoxide anion (O2-)-generating ability of human leukemia U937 cells. PA showed remarkable cytotoxicity accompanied by apoptosis, which was enhanced by ATRA. Furthermore, PA dramatically down-regulated the ATRA-induced O2--generating activity in a dose-dependent manner. Quantitative RT-PCR and immunoblot analyses showed that PA significantly reduces the ATRA-induced O2--generating activity by down-regulating gene expression of gp91-phox, an essential factor for the O2--generating activity of leukocytes. These findings reveal that PA has not only a strong, ATRA-enhanced cytotoxic effect but also a drastic suppressive effect on the ATRA-induced O2--generating activity through down-regulation of gp91-phox gene transcription. We expect that our findings will contribute to resolving the large-scale health disaster caused by beni-koji (red yeast rice).

Letter
Hinokitiol and pyrrolidone carboxylate zinc or corn oligosaccharides: A Synergistic approach to combating scalp microorganisms in seborrheic dermatitis Vol.11, No.5, p.233-241
Akihiro Michihara , Hiroshi Matsuoka , Junichi Fujii , Chiharu Furukawa , Xianting Lin , Jianzhong Yang
Released: September 19, 2024

Seborrheic dermatitis (SD) is a prevalent condition that results in dandruff, itching, and discomfort, and affects approximately 3–10% of the general population. Proliferation of the genus Malassezia, a microorganism inhabiting the scalp, is considered a contributing factor. However, even when the Malassezia population is reduced, SD and other diseases may still develop due to an increase in Staphylococcus aureus, which is associated with atopic dermatitis (AD) and SD, or a decrease in Staphylococcus epidermidis, which produces glycerol (a moisturizer) and inhibits S. aureus growth. Therefore, we investigated the concentrations of antimicrobial reagents (pyrrolidone carboxylate-zinc [PCA-Zn] and Hinokitiol) and malt oligosaccharides (MT: a corn-derived oligosaccharide mainly containing maltotetraose) that inhibited or promoted the growth of three types of scalp microorganisms. Individually, 0.50–1.00 mM PCA-Zn or 0.05–0.20 mM Hinokitiol displayed a marked growth-inhibitory effect on Malassezia furfur without a decline in S. epidermidis or an increase in S. aureus. Conversely, 0.02% MT alone exerted a growth-promoting effect on S. epidermidis but not on M. furfur. We then examined the effects of mixtures of the above-mentioned reagents on scalp-resident microorganisms. Our results indicated that 0.10 mM or 0.20 mM Hinokitiol combined with 0.02% MT markedly inhibited M. furfur growth and was the most effective at increasing S. epidermidis and decreasing S. aureus, compared with the single or combined effects of the other reagents. Overall, our study provides valuable information on Hinokitiol and oligosaccharide concentrations in mixtures for use in shampoo-type cosmetics and quasi-drugs to prevent and treat SD.

Original Article
Safety evaluation of exomaltotetraohydrolase from Pseudomonas stutzeri Vol.11, No.5, p.215-231
Shuji Matsumoto , Alan B. Richards
Released: September 10, 2024

Exomaltotetraohydrolase (G4ase) catalyzes the hydrolysis of (1→4)-α-D-glucosidic linkages in amylaceous polysaccharides from the non-reducing ends, removing successive maltotetraose units. A safety assessment was conducted for G4ase produced by the non-genetically modified Pseudomonas stutzeri strain MO-19. Two standardized acute oral toxicity studies in female rats were performed on G4ase, with total organic solids (TOS) not determined for the first study and 7.13% TOS for the second. The 50% lethal dose (LD50) of G4ase was determined to be more than 2000 mg/kg, corresponding to more than 143 mg-TOS/kg. A 2-week oral repeated-dose toxicity study in rats at 1000 mg/kg/day (highest dose; TOS not determined) demonstrated no treatment-related toxicity and was used to identify the appropriate dose for a 90-day study. A standardized 90-day oral repeated-dose toxicity study (gavage) of G4ase (7.13% TOS) in rats demonstrated that the No Observed Adverse Effect Level (NOAEL) of G4ase was 1000 mg/kg/day (the highest dose), corresponding to 71.3 mg-TOS/kg. Four standardized genotoxicity studies (bacterial reverse mutation, chromosomal aberration, and in vivo and in vitro micronucleus tests) were performed on G4ase (5.19, 5.19, 7.13 and 6.65% TOS, respectively). It was concluded that G4ase did not induce gene mutations in Salmonella typhimurium or Escherichia coli, did not induce chromosomal aberrations in cultured mammalian cells, and did not induce micronucleated erythrocytes in rat bone marrow cells or in human spleen cell line lymphoblasts. Taken together, these data indicate that G4ase from P. stutzeri strain MO-19 is safe for use as a processing aid in the manufacture of food for human consumption.