Carbide Rod Grades for End Mills & Drills: What Changes with Grain Size and Binder %
Conclusions
● Grain size and binder % are the two biggest “knobs” in carbide rod blanks. In plain terms: finer grains push hardness/wear up; higher binder pushes toughness/edge security up.
● If you grind sharp edges (small end mills, tight corner radii, thin web drills), start with an extra-fine/submicron microstructure and then adjust binder for edge security rather than jumping straight to a coarser grade.
● Don’t compare grades without checking the test method: hardness for hardmetals is commonly specified via Rockwell A; strength is often given as transverse rupture strength (TRS). Ask suppliers to state the standard used.
● When machining “hot” materials like titanium, tool substrates are typically selected for hot hardness and chemical stability; guidance often points toward low cobalt content and fine-grained uncoated carbide as a baseline substrate choice.
● For a reality check, look at a published rotary-tool cemented-carbide example: Sandvik’s H10F is described as submicron (~0.8 µm) with 10% Co and published property values (e.g., ~92.1 HRA, TRS ~4300 MPa). Use it as a reference point when you read datasheets.
My rule: pick grain size for wear + grindability first, then “tune” binder % for edge toughness and reliability.
The two knobs: WC grain size & binder %
When a toolmaker asks me for a “better carbide rod” without changing geometry or coating, I usually ask two questions: what WC grain size are you using? and what binder percentage? In straight WC–Co hardmetals, those two variables dominate how a blank balances hardness/wear resistance versus strength/toughness. Sandvik’s hardmetals handbook summarizes this trade-off directly: submicron grain sizes with lower binder are associated with high hardness and compressive strength, while higher binder with larger grains trends toward higher strength and toughness.
| Common grain-size class (example classification) | WC grain size range (µm) | Why this matters for tool blanks |
| --- | --- | --- |
| Ultra fine | < 0.5 | Excellent edge wear resistance potential; demands good processing control; often chosen where edge sharpness and abrasion dominate. |
| Extra fine | 0.5 – 0.9 | Common “workhorse” zone for rotary tool substrates: balance of hardness + edge strength when binder is tuned properly. |
| Fine | 1.0 – 1.3 | More forgiving for impact and edge chipping; can suit tougher drilling or interrupted cuts. |
| Medium and coarser | ≥ 1.4 | Typically moves further toward toughness/impact resistance at the cost of wear resistance and edge retention. |
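When reading datasheets, the classification above is easy to encode as a quick sanity-check lookup. This is a minimal sketch: the class names and boundaries come straight from the table, and the cut-offs between classes vary by supplier, so treat the thresholds as examples rather than a standard.

```python
def grain_size_class(mean_um: float) -> str:
    """Map a mean WC grain size (in µm) to the example classes in the
    table above. Boundaries follow that table; real suppliers may draw
    the lines differently, so adjust before relying on this."""
    if mean_um < 0.5:
        return "ultra fine"
    if mean_um < 1.0:
        return "extra fine"
    if mean_um < 1.4:
        return "fine"
    return "medium or coarser"

# A published submicron rotary-tool grade at ~0.8 µm lands here:
print(grain_size_class(0.8))  # extra fine
```

Note the table leaves small gaps (e.g., 0.9–1.0 µm); the sketch resolves them by rounding the boundary upward, which is a choice, not a rule.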
What grain size really changes
1) Edge wear vs edge chipping is largely a microstructure question
Tool edges fail in more than one way. If you’re mainly fighting flank wear and abrasion (e.g., abrasive cast irons, fiber-filled composites, certain powdered metals), a finer WC grain size is often the first lever you pull because it supports higher hardness and wear resistance at a given binder level. If you’re fighting micro-chipping, intermittent impact, or unpredictable chip load (common in drilling and in small end mills with thin cores), you may need more “edge security”—which can come from binder tuning or moving slightly coarser, depending on your constraints.
2) Grindability and edge sharpness
For tool grinders, a practical reality is that the blank has to survive grinding without edge crumble, and it has to hold a keen edge after grinding. Submicron structures are frequently used for rotary tools because they support sharp edge profiles—this is also explicitly noted in published grade descriptions for rotary-tool carbides.
3) “Grain size” is only meaningful if it’s measured consistently
I’ve seen teams compare a “0.8 µm” grade from one supplier with a “0.8 µm” grade from another and assume they’re the same. That’s risky. ISO 4499-2 explains that the most direct measurement involves polishing/etching and then quantitative metallography, and it outlines different ways to define mean grain size (length/area/volume) while recommending the linear-intercept technique for hardmetals. If you want true apples-to-apples comparisons, ask the supplier to state the method and to provide representative micrographs.
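The arithmetic behind the linear-intercept technique is simple, even though the full ISO 4499-2 procedure (polishing, etching, magnification calibration, which grains to count) is not. A minimal sketch of just the averaging step, with hypothetical intercept data:

```python
def mean_linear_intercept(intercept_lengths_um):
    """Arithmetic mean of WC grain intercept lengths (µm) measured along
    test lines on a polished/etched micrograph. Illustrative only:
    ISO 4499-2 defines the full measurement procedure, including sample
    preparation and how boundary grains are handled."""
    if not intercept_lengths_um:
        raise ValueError("no intercepts measured")
    return sum(intercept_lengths_um) / len(intercept_lengths_um)

# Hypothetical intercepts (µm) from one test line on a micrograph:
sample = [0.6, 0.9, 0.7, 1.1, 0.8]
print(round(mean_linear_intercept(sample), 2))  # 0.82
```

The point of asking suppliers for the method is exactly this: a “0.8 µm” mean can hide very different distributions, and different mean definitions (length/area/volume weighted) give different numbers for the same material.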
What binder % really changes
1) Toughness/strength vs hardness: the classic trade
In WC–Co, the binder phase is the ductile “glue” that holds the hard WC grains together. Increase binder and you generally gain toughness/strength, but you typically give up hardness. Sandvik’s handbook describes ranges where (for example) binder contents around 10–20% with WC grain sizes ~1–5 µm are associated with high strength/toughness, while lower binder (e.g., 3–15%) combined with submicron grain sizes is associated with high hardness and compressive strength.
2) Heat, chemistry, and “what the workpiece does to your tool”
In high-heat or chemically aggressive cutting environments, substrate selection often emphasizes hot hardness and chemical stability. For example, Sandvik’s titanium machining guidance notes that cutting tool materials should have good hot hardness and low cobalt content, and that fine-grained, uncoated carbide is usually used. The point isn’t that every titanium job must use the same binder %—it’s that binder choice is not just “toughness”; it also changes how the tool behaves at temperature.
3) Binder % affects quality-control signals (and that’s useful)
If your supplier measures magnetic properties like coercivity, that can be an early-warning indicator for binder chemistry/microstructure consistency in ferromagnetic-binder hardmetals. ISO 3326 defines a method for determining magnetization coercivity for hardmetals containing not less than 3% ferromagnetic binder by mass. (If you’re buying rods at scale, this kind of traceable QC is often more actionable than marketing claims.)
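If you agree a coercivity band with your supplier for a given grade, incoming inspection of rod batches is trivial to automate. A minimal sketch; the band values below are invented for illustration and a real band must come from the supplier’s QC data for that specific grade:

```python
def flag_out_of_band(coercivities, lo, hi):
    """Return indices of rods whose measured coercivity (per ISO 3326)
    falls outside the band agreed with the supplier. The band [lo, hi]
    is grade-specific; the numbers used below are placeholders."""
    return [i for i, hc in enumerate(coercivities) if not lo <= hc <= hi]

# Hypothetical batch readings (kA/m) against a hypothetical 22–28 band:
batch = [24.1, 29.3, 23.0, 25.7]
print(flag_out_of_band(batch, 22.0, 28.0))  # [1]
```

A flagged rod doesn’t tell you *what* drifted (binder chemistry, grain growth, carbon balance), but it tells you to ask before the rod becomes a scrapped tool.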

End mills vs drills: how the “same grade” behaves differently
Here’s the part that gets missed: the tool design changes how the substrate gets stressed. An end mill edge is often dominated by cyclic engagement (milling entries/exits) and can be optimized for sharpness + wear resistance. A drill has a chisel edge, a web, and long contact length; it is far more sensitive to edge chipping, heat, and chip evacuation issues. So even if two tools use “the same grade,” one may succeed and the other may chip—because the stress state is different.
Practical selection heuristics I use (starting points, not universal rules)
● Small end mills, tight radii, fine finishing: start with an extra-fine/submicron grain structure; keep binder moderate and rely on geometry/coating for strength where possible.
● General-purpose end mills: extra-fine to fine grain; tune binder upward if your failure mode is edge chipping rather than wear.
● Drills (especially deeper holes / unstable setups): prioritize edge security first (binder tuning, then grain-size adjustment); coolant-hole rods can help heat/chip control, but the substrate still needs toughness margin.
● Difficult-to-machine materials (e.g., titanium): start with fine-grained substrate logic and avoid “too much cobalt” as a default—then validate with real cutting tests for your geometry and coolant strategy.
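The heuristics above can be condensed into a starting-point picker. This is my own shorthand, not a standard: the tool and failure-mode labels are placeholders, and the returned descriptions are starting points to map onto real supplier grades, then validate with cutting tests.

```python
def starting_point(tool: str, failure_mode: str) -> str:
    """Suggest a substrate starting point from the heuristics above.
    tool: 'small_end_mill', 'end_mill', or 'drill'
    failure_mode: 'wear' or 'chipping'
    Returns a description, not a real supplier grade."""
    if tool == "small_end_mill":
        return "extra-fine/submicron grain, moderate binder"
    if tool == "drill":
        return "extra-fine to fine grain, binder tuned up for edge security"
    if failure_mode == "chipping":
        return "extra-fine to fine grain, binder tuned upward"
    return "extra-fine to fine grain, moderate binder"

print(starting_point("drill", "chipping"))
```

Encoding the rules this way also forces you to state your dominant failure mode explicitly, which in my experience is the question most grade-selection arguments skip.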
FAQ
What grain size is typically used for solid carbide end mills?
End mills commonly use fine to extra-fine/submicron WC grain sizes because they support sharp edge profiles and strong wear resistance potential. A referenced classification lists “extra fine” as 0.5–0.9 µm and “ultra fine” as <0.5 µm, which are frequently discussed in the context of cutting tools.
What grain size is typically used for carbide drills?
Many drills also use fine/extra-fine structures, but drills often need more edge security due to their stress state (chisel edge, long contact length). In practice, you often tune binder % upward before moving much coarser—then confirm with cutting tests. Use standards-based grain-size reporting to compare suppliers.
Does higher cobalt always mean better performance?
Not always. Higher binder (e.g., cobalt) is associated with higher strength/toughness, but it generally reduces hardness. Selection depends on your dominant failure mode (wear vs chipping) and the workpiece/temperature regime.
What test methods should I look for on a tungsten carbide rod certificate?
For hardmetals: Rockwell A hardness testing is specified in ISO 3738-1; transverse rupture strength (TRS) is specified in ISO 3327; coercivity measurement is specified in ISO 3326; and metallographic WC grain size measurement guidance is in ISO 4499-2.
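For orientation on the TRS number itself: a three-point bend test on a rectangular beam follows the textbook bending formula σ = 3FL/(2bh²). ISO 3327 fixes the actual specimen types, dimensions, and test conditions, so treat this sketch as the arithmetic only; the load and dimensions below are hypothetical.

```python
def trs_three_point_MPa(force_N, span_mm, width_mm, height_mm):
    """Transverse rupture strength of a rectangular beam in three-point
    bending: sigma = 3*F*L / (2*b*h^2). With F in newtons and dimensions
    in mm, the result is in MPa (N/mm^2). ISO 3327 specifies the real
    specimen geometry and procedure; this is just the textbook formula."""
    return 3 * force_N * span_mm / (2 * width_mm * height_mm ** 2)

# Hypothetical test: 24 kN failure load, 14.5 mm span, 5 mm x 5 mm bar
print(round(trs_three_point_MPa(24000, 14.5, 5, 5)))  # 4176
```

Notice how strongly the result depends on h²: a small error in specimen height (or a chamfered edge) moves the reported TRS a lot, which is one more reason certificates should state the standard and specimen type used.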
Why do “submicron” rods from different suppliers behave differently?
Because “submicron” can be measured/defined differently, and because distribution, porosity, binder chemistry, and process control (e.g., preparation/etching for metallography) influence real properties. ISO 4499-2 notes that preparation/etching techniques are important and describes multiple ways to define mean grain size.
References
● Sandvik Hard Materials handbook/guide: notes on WC–Co grade classification by cobalt content and WC grain size, and grain-size classification ranges (ultra fine to extra coarse).
● ISO 4499-2:2020 — Hardmetals: Metallographic determination of microstructure — Part 2: Measurement of WC grain size (methods, definitions, and recommendation of linear-intercept).
● Sandvik H10F datasheet for rotary tools (submicron ~0.8 µm; 10% Co; published hardness and TRS values).
● ISO 3738-1 — Hardmetals: Rockwell hardness test (scale A) — Part 1: Test method.
● ISO 3327 — Hardmetals: Determination of transverse rupture strength (TRS).
● ISO 3326:2013 — Hardmetals: Determination of (the magnetization) coercivity.
● Sandvik Coromant guidance (titanium machinability): emphasizes hot hardness, low cobalt content, and fine-grained uncoated carbide as a usual choice.