Shipboard IW analyses were performed on 5- to 15-cm-long whole-round sections that were cut on deck immediately after core retrieval. Details of the sampling resolution are given in the individual site chapters. After extrusion from the core liner, the surface of each whole round was carefully scraped with a spatula to remove potential contamination. Interstitial waters were collected using a titanium squeezer modified after the standard stainless steel squeezer of Manheim and Sayles (1974). After the squeezer was loaded, pore water was extruded through prewashed Whatman no. 1 filters fitted on a titanium screen by applying pressures as high as 40,000 lb (~4150 psi) with a hydraulic press.
Interstitial water was collected into acid-washed (10% HCl) 50-mL plastic syringes through 0.45-µm Gelman polysulfone disposable filters. Samples for shipboard work were stored in plastic vials before analysis.
Interstitial water samples were routinely analyzed for salinity, as total dissolved solids, using a Goldberg optical handheld refractometer (Reichert). Alkalinity was measured by Gran titration using a Metrohm pH electrode and autotitrator. The pH was measured on the National Bureau of Standards scale as part of the alkalinity titration. In situ electrode potential measurements were made before squeezing the whole round, using a glass combination electrode and a Metrohm portable pH meter. The electrode was calibrated against TRIS and BIS buffers to calculate pH on the free H+ scale (Gieskes et al., 1991). The pH determined in this fashion (ppH) was more reliable than that obtained during the alkalinity titration, because the algorithm employed for the pH measurement at the start of the alkalinity titration is adversely affected by degassing of the sample.
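The Gran titration reduces to a simple extrapolation: past the equivalence point, the Gran function F = (V0 + V)·10^(-pH) is linear in titrant volume V, and its x-intercept gives the equivalence volume from which alkalinity follows. The sketch below is illustrative only; the routine and all volumes, normalities, and pH readings are made-up stand-ins, not the shipboard software or data.

```python
import numpy as np

def gran_alkalinity(v0_ml, acid_n, vols_ml, ph):
    """Gran extrapolation for total alkalinity.

    F = (V0 + V) * 10**(-pH) is linear in V beyond the equivalence
    point; a least-squares line through those points crosses F = 0 at
    the equivalence volume Ve. Alkalinity (meq/L) = 1000 * Ve * N / V0.
    """
    f = (v0_ml + vols_ml) * 10.0 ** (-ph)
    slope, intercept = np.polyfit(vols_ml, f, 1)
    ve_ml = -intercept / slope               # equivalence volume (mL)
    return 1000.0 * ve_ml * acid_n / v0_ml   # meq/L

# Synthetic post-equivalence readings for a 3 mL sample titrated with
# 0.1 N HCl (hypothetical values chosen so Ve = 0.07 mL exactly):
v0, n, ve_true = 3.0, 0.1, 0.07
vols = np.array([0.10, 0.12, 0.14, 0.16, 0.18])
ph = -np.log10(n * (vols - ve_true) / (v0 + vols))
alk = gran_alkalinity(v0, n, vols, ph)  # recovers ~2.33 meq/L
```

Because the synthetic pH values are generated from the same excess-acid model the Gran function assumes, the fit recovers the equivalence volume essentially exactly; real titration data would scatter about the line.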
Dissolved chloride was determined by titration using the method of Gieskes et al. (1991). Silica, phosphate, and ammonium were determined by spectrophotometric methods using a Milton Roy Spectronic 301 spectrophotometer (Gieskes et al., 1991). The standard deviations for the analyses are given in Table T6.
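The spectrophotometric determinations rest on a linear Beer's-law calibration: absorbances of standards of known concentration define a line, and sample concentrations are read off by inverting it. A minimal sketch, with entirely made-up standard concentrations and absorbances (not the cruise's calibration):

```python
import numpy as np

# Hypothetical calibration standards for a colorimetric nutrient method:
std_conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])    # µM
std_abs = np.array([0.002, 0.051, 0.101, 0.199, 0.402])  # absorbance

# Least-squares line through the standards (Beer's law: A = m*C + b)
slope, intercept = np.polyfit(std_conc, std_abs, 1)

def absorbance_to_conc(a):
    """Invert the calibration line to get concentration in µM."""
    return (a - intercept) / slope

sample_um = absorbance_to_conc(0.150)  # ~149 µM for these numbers
```

In practice a calibration is run with each batch of samples, and standards are matrix-matched to seawater to suppress interference effects.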
Concentrations of sodium, potassium, magnesium, calcium, chloride, and sulfate were analyzed by ion chromatography using a Dionex DX-120. Standard deviations are given in Table T6. Chloride measurements by ion chromatography were systematically higher, by ~3%-5%, than those obtained by titration; hence, only titration results are reported. Because H2S interferes with the titrimetric analysis of Cl-, all samples with high H2S concentrations were treated with 100 µL of 30% H2O2 5 min before analysis (Shipboard Scientific Party, 1997).
Concentrations of iron,
lithium, and strontium were quantified using flame atomic emission (AES) or
absorption (AAS) spectrometry on a Varian Spectra AA-20. Iron was determined
directly in the alkalinity titration residues. Air-acetylene (Fe, Li) and
nitrous oxide acetylene (Sr) flames were utilized. Standards for all flame AAS/AES
techniques were matched in matrix composition to the samples (Li, Sr) or
prepared in synthetic seawater (Fe). A more detailed description of all methods
and standards for all analyses used can be found in Gieskes et al. (1991). The
1-σ standard deviations
were ~2% for lithium and ~3%-4% for strontium.
Mineralogy was determined
on solid carbonate samples using X-ray diffraction (XRD). Quantitative XRD
analyses were performed on bulk samples to determine the relative percentage of
aragonite, calcite, quartz, and dolomite. Samples were run in batches of 20 and
scanned from 25° to 35°2θ, counting for 1.0 s at 0.02°2θ
steps. To overcome the limitations of using multicomponent standards, conversion
from peak areas to mineral weight percent was accomplished using the H-factor
method of Hooton and Giorgetta (1977), modified to use low-Mg calcite as the
common internal standard. In this method the areas of the peaks of interest were
obtained relative to the calcite peak and calibrated using calibration curves
from a series of two-component standards. The final weight percent of each
mineral was adjusted to the appropriate carbonate concentration measured on the
same sample (see "Organic
Geochemistry"). Overall, the accuracy of the XRD analysis is
within ±5 wt% of the actual value, with a standard deviation of 3 wt%.
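The peak-area-to-weight-percent conversion can be illustrated with a simplified sketch. The calibration slopes, peak areas, and the assumption of a linear calibration are hypothetical stand-ins for the cruise's actual two-component calibration curves; the structure (calcite internal standard, normalization, rescaling to the measured carbonate content) follows the description above.

```python
# Simplified sketch of H-factor-style XRD quantification with a low-Mg
# calcite internal standard. All numeric inputs are hypothetical.
def xrd_weight_percents(areas, k, carbonate_wt_pct):
    """areas: mineral -> integrated peak area (must include "calcite").
    k: mineral -> calibration slope from two-component standards,
       relating the peak-area ratio (vs. calcite) to the weight ratio.
    Carbonate phases are rescaled to the carbonate content measured on
    the same sample; noncarbonate phases take the remainder."""
    a_cal = areas["calcite"]
    wr = {m: (a / a_cal) / k[m] for m, a in areas.items() if m != "calcite"}
    wr["calcite"] = 1.0                      # internal standard reference
    total = sum(wr.values())
    raw = {m: 100.0 * x / total for m, x in wr.items()}  # sum to 100%
    carb = {m for m in raw if m in ("aragonite", "calcite", "dolomite")}
    carb_sum = sum(raw[m] for m in carb)
    return {m: (pct * carbonate_wt_pct / carb_sum if m in carb
                else pct * (100.0 - carbonate_wt_pct) / (100.0 - carb_sum))
            for m, pct in raw.items()}

# Hypothetical peak areas and calibration slopes for one sample:
areas = {"aragonite": 120.0, "calcite": 100.0, "dolomite": 30.0, "quartz": 10.0}
k = {"aragonite": 1.2, "dolomite": 0.8, "quartz": 0.5}
wt = xrd_weight_percents(areas, k, carbonate_wt_pct=90.0)  # 90 wt% CaCO3 assumed
```

The rescaling step is what ties the XRD mineralogy back to the independent carbonate measurement: the three carbonate phases are forced to sum to the coulometric carbonate content, and the noncarbonate fraction absorbs the rest.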