Pages
- Home
- Cloud & Edge Computing
- Machine Learning & AI
- Microprocessors & Servers
- HDI Packaging Technologies
- Packaging Technologies (CSP, FCXGA, POP)
- Project Management 101
- Six Sigma & Statistics
- Reliability Engineering
- Package Design / Product Development
- 3DIC.ORG [Fan Out]
- 3DIC.ORG [2.5D/3D]
- Semi Accurate
- The Inquirer
- Feedly-Tech
- Semiconductor Packaging
- Advanced Packaging
- Silicon Far East
- Post Silicon Platform Validation
- Online Articles -AI/ML
- Online Articles - Packaging
Sunday, July 14, 2013
Smart enough for your phone?
"ABCs of smartphone screens: 1080p and more (Smartphones Unlocked) | Dialed In - CNET Blogs" http://feedly.com/k/12NYwaZ
Friday, May 10, 2013
More Memories
Double Data Rate Synchronous RAM (DDR SDRAM)
Single Data Rate Synchronous RAM (SDR SDRAM)
Proprietary memory modules
L1 cache - Memory accesses at full microprocessor speed (10 nanoseconds, 4 kilobytes to 16 kilobytes in size)
L2 cache - Memory access of type SRAM (around 20 to 30 nanoseconds, 128 kilobytes to 512 kilobytes in size)
Main memory - Memory access of type DRAM (around 60 nanoseconds, 32 megabytes to 128 megabytes in size)
Hard disk - Mechanical, slow (around 12 milliseconds, 1 gigabyte to 10 gigabytes in size)
Internet - Incredibly slow (between 1 second and 3 days, unlimited size)
Courtesy of HowStuffWorks: http://www.howstuffworks.com/
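The latencies above combine into an average memory access time (AMAT). A minimal sketch using the listed latencies; the hit rates are illustrative assumptions, not from the source:

```python
def amat(levels):
    """levels: list of (hit_rate, latency_ns) ordered fastest-first.
    Each access pays the latency of every level it reaches; a miss
    falls through to the next (slower) level."""
    total, p_reach = 0.0, 1.0
    for hit_rate, latency in levels:
        total += p_reach * latency       # accesses reaching this level pay its latency
        p_reach *= (1.0 - hit_rate)      # fraction that misses and falls through
    return total

hierarchy = [
    (0.90, 10),    # L1 cache: 10 ns (assumed 90% hit rate)
    (0.95, 25),    # L2 cache: ~20-30 ns (assumed 95% hit rate)
    (1.00, 60),    # main memory: ~60 ns (assumed to always hit)
]
print(round(amat(hierarchy), 2))   # -> 12.8
```

Even with main memory six times slower than L1, high cache hit rates keep the average close to the L1 latency.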
Thursday, May 9, 2013
uP Timeline
1972 Intel 8008 The first 8-bit processor, the 8008 had an address space of 16KB and was clocked at 500KHz up to 800KHz
1974 Intel 8080 The 8080 was a significant step up, boasting a clock speed of 2MHz and able to address 64KB memory. Early desktop computers used this chip and the CP/M operating system
1976 Zilog Z80 Zilog was founded by ex-Intel engineers who created a compatible but superior chip to the 8080. The Z80 powered many CP/M machines, plus home computers like the ZX Spectrum
1978 Intel 8086 Famous as the first x86 chip, the 8086 was also Intel’s first 16-bit chip with about 29,000 transistors and was clocked initially at 4.77MHz
1982 Intel 80286 The 80286 was a high-performance upgrade of the 8086, and was used by IBM in the PC-AT. First clocked at 6MHz, later versions ran up to 25MHz. The 286 had a 16MB address space and 134,000 transistors
1985 Intel 80386 Intel’s first 32-bit chip, the 386 had 275,000 transistors – over 100 times that of the 4004. Versions of the 386 eventually reached 40MHz
1985 Acorn ARM produced as co-processor for BBC Micro Seeking a new chip to power future business computers, the makers of the BBC Micro decided to build their own, calling it the Acorn RISC Machine (ARM)
1987 Sun SPARC Like Acorn, Sun was looking for a new chip and decided to create its own. The Sparc architecture is still used today in Sun (now Oracle) systems, and supercomputers
1989 Intel 80486 A higher performance version of the 386, Intel’s 486 was the first x86 chip with over 1 million transistors (1.2 million). It was also the first with an on-chip cache and floating point unit
1990 IBM RS/6000 introduces Power chips IBM experimented with RISC chips in the 1970s, and this bore fruit with the RS/6000 workstation in 1990. The processor later developed into the Power chip used by IBM and Apple
1993 Intel Pentium The Pentium was a radical overhaul of Intel’s x86 line, introducing superscalar processing. Starting at 60MHz but eventually reaching 300MHz, the Pentium had 3,100,000 transistors
1995 Intel Pentium Pro Developed as a high-performance chip, the Pentium Pro introduced out-of-order execution and L2 cache inside the same package. This line later morphed into the Xeon line
1996 AMD K5 AMD had been manufacturing Intel chips under licence for years, but the K5 was its first in-house design, intended to compete with the Pentium
1999 AMD Athlon The AMD Athlon was the firm’s first processor that could beat Intel on performance. Starting at 500MHz, a later version was the first x86 chip to hit 1GHz and had 22 million transistors
2000 Intel Pentium 4 Another major redesign, the Pentium 4 introduced Intel’s Netburst architecture. It was clocked at 1.4GHz initially, rising to 3.8GHz, and had 42 million transistors
2001 Intel Itanium Developed by Intel and HP, Itanium is a 64-bit non-x86 architecture developed for parallelism and aimed at enterprise servers. The Itanium family has not been a great success
2002 TI Omap ARM TI became one of the largest makers of system-on-a-chip devices for smartphones and PDAs with the Omap family, combining an ARM CPU with circuitry such as GSM processors
2003 Intel Pentium-M (Centrino) The Pentium-M was designed specifically for laptops, and formed the core of Intel’s first Centrino platform. It had 77 million transistors and was clocked from 900MHz
2003 AMD Opteron While Intel laboured with Itanium, AMD introduced the first 64-bit x86 chips with the Opteron, which proved popular in workstations and servers. It had over 105 million transistors
2005 Intel Pentium-D Intel introduced its first dual-core chips in 2005, starting with the Pentium Extreme Edition. The Pentium D was the first mainstream desktop chip to follow suit
2006 AMD acquires ATI AMD bought up ATI, announcing ambitious plans to combine its x86 processors with ATI’s graphics processors
2006 Intel Xeon 5300 Intel‘s first quad-core chips were the Xeon 5300 line for workstations and servers. Actually two dual-core dies joined together, these have a total of 582 million transistors
2008 Qualcomm SnapDragon ARM Wireless technology firm Qualcomm started producing high-performance smartphone chips based on the ARM architecture. SnapDragon is clocked at 1GHz and has 200 million transistors
2011 Intel Core i3, i5, i7 Intel’s latest chips, based on the Sandy Bridge architecture. The desktop processors have up to eight cores on a single chip and up to 995 million transistors
2011 AMD Fusion chips The Fusion line combines multiple CPU cores on a single chip along with ATI GPU cores, with the first chips having up to 1.45 billion transistors
2011 ARM announces ARMv8 64-bit architecture ARM unveils its specifications for future 64-bit chips. Although some years away, products based on ARMv8 could have as many as 128 cores
-excerpted from "40 years of the microprocessor" published in the Inquirer http://www.theinquirer.net/
Monday, April 29, 2013
Failure Analysis Techniques: Resolution
Failure Analysis: Tools & Techniques
Topography: SEM (low voltage, inelastic collisions, higher resolution, low contrast) / BSE (high voltage, elastic collisions, lower resolution, high contrast)
Morphology: (lattice geometry, crystallographic structure) EBSD/TEM/AFM/STM
Material analysis:
Elemental: EDX/WDX/XRF
Chemical: (structural bonds, oxidation states) AES/XPS/EELS/SIMS/FTIR
Interaction between primary electrons & matter: SEM, TEM, BSE, EBSD, EDX & WDX
Interaction between primary X-Rays & matter: XPS, AES, XRF
Other techniques: Opticals, X-Rays, CSAM, Curve Trace, TDR, IR & thermal imaging, SQUID, LSM (LIVA/OBIC for opens & TIVA/OBIRCH for shorts), x-sections, P-laps & FIB cuts
Making Sense of Physics-of-Failure Based Approaches
1. Study of the hardware configuration: geometry, design, materials, structure
2. Study of life cycle loads: operational loads (power, voltage, bias, duty cycle) & environmental loads (temperature, humidity, vibration, shock)
3. Stress analysis: Stress-strength distributions/interference, cumulative damage assessment & endurance interference, FMEA, hypothesize failure mechanisms, failure sites & associated failure models, root cause analysis, calculate RPNs to rank & prioritize failures.
4. Reliability assessment: Rel metrics characterization, life estimation, operating/design margin estimation.
5. Interpret & apply results: Design tradeoffs & optimization, ALT planning & development, PHM & HUMS planning.
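The stress-strength interference mentioned in step 3 can be sketched numerically. A minimal example assuming independent, normally distributed stress and strength; all numbers are illustrative:

```python
from statistics import NormalDist

def stress_strength_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """Classic interference model: reliability = P(strength > stress)
    for independent normal stress and strength."""
    mu_d = mu_strength - mu_stress                   # mean of D = strength - stress
    sd_d = (sd_strength**2 + sd_stress**2) ** 0.5    # std dev of D
    return 1 - NormalDist(mu_d, sd_d).cdf(0)         # P(D > 0)

# Illustrative: strength ~ N(50, 4), stress ~ N(38, 3) (e.g. in MPa)
r = stress_strength_reliability(50, 4, 38, 3)
```

Shrinking either distribution's spread (or widening the mean gap) pushes reliability toward 1, which is the design-margin lever in step 4.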
Why is the Exponential Distribution special?
1. Constant failure rate (or, hazard rate = lambda) -> used to model useful life portion of the bathtub curve.
2. Memoryless: conditional reliability R(t+T)/R(T) = R(t), i.e. reliability over the next t hours does not depend on the age T already survived.
3. A 3-parameter Weibull (eta, beta, gamma) with beta = 1 is the same as a 2-parameter exponential (with eta = MTTF = 1/lambda).
4. A 1-parameter Weibull (eta, beta=1, gamma=0) is the same as a 1-parameter exponential (with eta = MTTF = 1/lambda).
5. R(t=MTTF) = 36.8% & Q(t=MTTF) = 63.2%.
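These properties are easy to verify numerically. A short sketch of R(t) = exp(-t/MTTF), checking the memoryless property and the 36.8% figure (the MTTF value is illustrative):

```python
import math

def rel_exp(t, mttf):
    """Reliability of the exponential distribution: R(t) = exp(-t/MTTF)."""
    return math.exp(-t / mttf)

mttf = 1000.0  # hours (illustrative)

# R(MTTF) = e^-1 ~ 36.8%, so Q(MTTF) ~ 63.2%
print(round(100 * rel_exp(mttf, mttf), 1))   # -> 36.8

# Memorylessness: conditional reliability over the next t hours
# is the same regardless of the age T already survived.
t, T = 200.0, 500.0
cond = rel_exp(T + t, mttf) / rel_exp(T, mttf)   # R(t+T)/R(T)
assert abs(cond - rel_exp(t, mttf)) < 1e-12
```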
Hypothesis Tests: How?
1. Develop null & alternate hypotheses
2. Set up test parameters (1-sided v/s 2-sided, choose distribution & significance level or alpha)
3. Calculate test statistic & corresponding p-value
4. Compare p-value with alpha & interpret results
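The calculate-and-compare steps above can be sketched for a one-sample z-test using only the standard library; the data and parameters are illustrative:

```python
from statistics import NormalDist, mean

def one_sample_z_test(sample, mu0, sigma, alpha=0.05):
    """Two-sided one-sample z-test (population sigma assumed known)."""
    n = len(sample)
    z = (mean(sample) - mu0) / (sigma / n ** 0.5)     # test statistic
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided p-value
    return z, p_value, p_value < alpha                # True => reject H0

# Illustrative data: has this process drifted from a mean of 10.0 (sigma = 0.3)?
z, p, reject = one_sample_z_test(
    [10.2, 9.8, 10.5, 10.1, 10.4, 10.0, 10.3, 9.9, 10.6, 10.2],
    mu0=10.0, sigma=0.3)
```

Here p falls below alpha = 0.05, so the null hypothesis (mean = 10.0) would be rejected; with unknown sigma or small n, a t-test replaces the z-test per the table below.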
Hypothesis Tests : Which & When?
Test of Means:
1-sample or 2-sample: Use z-test for n>=30 or when population variance is known, else use t-test
> 2-samples: Use ANOVA
Test of Variances:
1-sample: Use Chi-square test
2-samples: Use F-ratio test
Test of Proportions:
1-sample or 2-sample: Use z-test
>2-samples: Use Chi-square test
Distributions
Hypergeometric: Probability of r rejects in n sample size for N population size with d total rejects. (Intended for small, finite, well characterized populations)
Binomial: Probability of r rejects in n sample size, where n < 10% of N population size, where chance of success in any given trial always stays the same (p)(Intended for large population sizes)
Poisson: Probability of r rejects (=defects or events) in infinite population size, for a given failure rate (lambda). (Intended for n->infinity & p->0)
Binomial distribution approximates Hypergeometric distribution for large N.
Poisson distribution approximates Binomial distribution when n is large & p is small (with lambda = np held constant).
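The approximation claims can be checked numerically. A minimal sketch comparing binomial and Poisson probabilities for a large-n, small-p case (the defect rate and sample size are illustrative):

```python
import math

def binom_pmf(r, n, p):
    """Probability of exactly r rejects in a sample of n, defect rate p."""
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

def poisson_pmf(r, lam):
    """Probability of exactly r events given expected count lam."""
    return math.exp(-lam) * lam**r / math.factorial(r)

# Large n, small p: Poisson(lam = n*p) tracks the binomial closely.
n, p = 1000, 0.002       # illustrative: 0.2% defect rate
lam = n * p              # 2 expected rejects
for r in range(5):
    assert abs(binom_pmf(r, n, p) - poisson_pmf(r, lam)) < 0.001
```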
Distributions for Continuous Data: Normal, Lognormal, Exponential, Weibull
SPC/Control Charts
For variable data, use I-MR (for n=1), X(bar)-R (for n = 2 to 10) or X(bar)-s (for n>10)
For attribute data:
1. Count/proportion of defectives is estimated through binomial distribution. For constant sample size(n), estimate count of defectives using np chart, while for variable sample size, estimate proportion of defectives using p-charts.
2. Count/rate of defects is estimated through poisson distribution. For constant sample size(n), estimate count of defects using c-chart, while for variable sample size, estimate rate of defects using u-chart.
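As a sketch of the variable-data case, the I-MR chart (n = 1) sets its limits from the mean moving range using the standard d2 = 1.128 constant; the data here is illustrative:

```python
from statistics import mean

def imr_limits(xs):
    """Control limits for an Individuals (I-MR) chart, subgroup size n = 1.
    Uses d2 = 1.128 for moving ranges of size 2, i.e. limits at
    xbar +/- 2.66 * MRbar."""
    mrs = [abs(a - b) for a, b in zip(xs, xs[1:])]   # moving ranges
    xbar, mrbar = mean(xs), mean(mrs)
    ucl = xbar + 3 * mrbar / 1.128
    lcl = xbar - 3 * mrbar / 1.128
    return lcl, xbar, ucl

# Illustrative individual measurements from a stable process:
lcl, centerline, ucl = imr_limits([5.1, 4.9, 5.2, 5.0, 4.8, 5.1, 5.0])
```

Points outside (lcl, ucl), or non-random patterns within them, signal special-cause variation worth investigating.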
Six Sigma & Process Variation
-Approx 68% of variation is contained within +/- 1sigma
-Approx 95% of variation is contained within +/- 2sigma
-Approx 99.7% of variation is contained within +/- 3sigma
Cp = 1 when +/- 3 sigma is contained within spec limits.
Cp = 1.33 when +/- 4 sigma is contained within spec limits.
Cp = 1.50 when +/- 4.5 sigma is contained within spec limits.
Cp = 1.67 when +/- 5 sigma is contained within spec limits.
Cp = 2.00 when +/- 6 sigma is contained within spec limits.
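The table follows directly from the definition Cp = (USL - LSL) / (6 * sigma); a one-line check:

```python
def cp(usl, lsl, sigma):
    """Process capability index: Cp = (USL - LSL) / (6 * sigma)."""
    return (usl - lsl) / (6 * sigma)

# Spec limits at +/- 4 sigma around the target give Cp = 1.33, per the table:
sigma = 1.0
print(round(cp(+4 * sigma, -4 * sigma, sigma), 2))   # -> 1.33
```

Note Cp assumes a centered process; Cpk adjusts for an off-center mean.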
Acceptance sampling: LTPD & AQL
LTPD = definition of a threshold bad lot.
The sampling plan is designed around the AQL/LTPD such that it defines:
1. MAX chance of ACCEPTING lots of quality that is equal or worse than LTPD. This chance/risk is BETA or CONSUMER's RISK.
2. MAX chance of REJECTING lots of quality that is equal or better than AQL. This chance/risk is ALPHA or PRODUCER's RISK.
Alpha (probability of rejecting an AQL-quality lot) is usually set to 0.05. This equates to a 95% chance/confidence of acceptance.
Beta (probability of accepting an LTPD-quality lot) is usually set to 0.10. This equates to a 90% chance/confidence of rejection.
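An operating-characteristic calculation makes the two risks concrete. A sketch for a single sampling plan under the binomial model; the plan (n, c) and quality levels are assumed for illustration:

```python
import math

def accept_prob(p_defective, n, c):
    """Probability a lot with defect rate p passes a single sampling
    plan: sample n units, accept if at most c rejects (binomial model)."""
    return sum(math.comb(n, r) * p_defective**r * (1 - p_defective)**(n - r)
               for r in range(c + 1))

# Illustrative plan: sample 125 units, accept on 3 or fewer rejects.
n, c = 125, 3
aql, ltpd = 0.01, 0.05                        # assumed quality levels
producer_risk = 1 - accept_prob(aql, n, c)    # alpha: rejecting an AQL lot
consumer_risk = accept_prob(ltpd, n, c)       # beta: accepting an LTPD lot
```

Sweeping p_defective through accept_prob traces the plan's OC curve, which is how n and c are tuned against the alpha and beta targets above.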
Power, Confidence, Error, Significance
Accept null hypothesis when false (false negative) = beta or Type 2 error
Reject null hypothesis when true (false positive) = alpha or Type 1 error
Reject null hypothesis when false: POWER = (1-beta)
Accept null hypothesis when true: CONFIDENCE = (1-alpha)
At high power, beta is small => a true effect is likely to yield a p-value < alpha (significance level), so real effects tend to be deemed significant.
At low power, beta is large => a true effect is likely to yield a p-value > alpha, so real effects tend to be missed & deemed insignificant.
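The power of a simple one-sided z-test can be computed analytically, showing how a larger sample shrinks beta; all parameters are illustrative:

```python
from statistics import NormalDist

def z_test_power(delta, sigma, n, alpha=0.05):
    """Power of a one-sided, one-sample z-test to detect a true mean
    shift of delta (analytic, normal model with known sigma)."""
    z_crit = NormalDist().inv_cdf(1 - alpha)
    # Under the alternative, the test statistic is centered at delta*sqrt(n)/sigma
    return 1 - NormalDist().cdf(z_crit - delta * n**0.5 / sigma)

# The same 0.5-sigma shift is far easier to detect with more samples:
low = z_test_power(delta=0.5, sigma=1.0, n=10)    # under 50% power
high = z_test_power(delta=0.5, sigma=1.0, n=50)   # well over 95% power
```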
Thursday, April 25, 2013
2.5D/3D TSV & Silicon Interposers: Weighing Pros v/s Cons
Product Development: Womb to Tomb, Cradle to the Grave
Six Sigma : Process & Design
Process: Aims to reduce process variation
Define: Plan, scope, charter, schedule, team, objectives, milestones, deliverables
Measure: MSA, GR&R, Process Capability, Yields
Analyze: Hypothesis tests, ANOVA, PFMEA, Process Maps (KPIV/KPOV)
Improve: DoE
Control: SPC, Control Charts
Design: Aims to reduce cycle time and need for rework
Define: Plan, scope, charter, schedule, team, objectives, milestones, deliverables
Measure: Baseline, benchmark, functional parameters, specs & margins
Analyze: DFMEA, Risk analysis, GAP analysis
Develop: Deliver design
Optimize: DfX - tradeoffs
Validate: Prototype builds
Firefighting through methodical madness
1. Develop Team
2. Define Problem: Failure rate, lots affected, establish scope
3. Containment: Raise red flags, lots on hold, generate documentation, reliability assessment, sampling plans, increased checks & balances
4. Problem analysis: Process mapping, history tracking, establish commonalities & dependencies, consult FMEA, RCA/5W/5M, failure analysis, establish hypotheses, develop CAPA theories (short-term/mid-term/long-term)
5. Verify corrective actions: Engineering studies to duplicate problem and verify effectiveness of CA
6. Implement corrective action: Release lots, provide disposition, soft ramp through full release with increased sampling, document lessons learnt
7. Implement preventive action: Mid-term/long-term actions to prevent any recurrences in future
8. Congratulate team
High Density Integration schemes
PoP, SoC & Die Stacking (wire-bonded only, wire-bonded + flip-chip, TSV/TSI 2.5D/3D, F2F FC bonded die).
What's in a SoC?
Some digital logic (CPU, GPU & chipset logic such as GNB); Memory (DDR RAM, cache); analog signal & power management (sensors, drivers, actuators, controllers), interconnect buses & interfaces (PCI, HT) and DfT structures (BIST, JTAG Boundary Scans)
An acronymously brief history of semiconductor packaging
CERDIP -> PDIP -> SOP/QFP -> BGA & FCA -> QFN -> CSP/WLP -> PoP/SiP -> SoC -> TSV/TSI
Smartphone Components
Antenna + Switch & RFFE, Filter, Duplexer, Amplifier, Transceiver, Baseband, Application Processor [SOC + LPDDR3], Memory [Flash / SSD...
Molded Embedded Packages seek to overcome warpage concerns and limitations in z-height that conventional PoP type packages suffer from, w...
L-Gate: Technology/Product Development L-1: Explore / PC1 L 0: Define / PC2 & T/O L 1: Enable/BKM determination L 2: Implement/BK...
2 primary flows: CoS and CoW (or CoC) CoW/CoC may be chip-first (attach before interposer MEOL) or chip-last (after interposer MEOL) Ch...