Example sentences for "prefetching"

  • In recent years, many high-performance processors have used prefetching techniques.
  • The extension also provides an optional controversial feature which forces link prefetching.
  • This allows both prefetching and traditional instruction-level parallelism.
  • In particular, he advocates the use of age oriented collectors and prefetching.
  • Prefetching occurs when a processor requests an instruction or data block from main memory before it is actually needed.
  • Explicitly parallel instruction computing (EPIC) is like VLIW, with extra cache prefetching instructions.
  • Another pattern of prefetching instructions is to prefetch addresses that are "s" addresses ahead in the sequence (a minimal stride-prefetch sketch in C follows this list).
  • Increasing the block size leads to prefetching of nearby words in a block, preventing future cold misses.
  • Jumps (conditional or unconditional branches) interfere with the prefetching of instructions, thus slowing down code.
  • Increasing the block size too much can lead to prefetching of useless data, thus increasing the number of cold misses.
  • However, since up to four instructions were packed into each 60-bit word, some degree of prefetching was inherent in the design.
  • "' Application prefetching "'works in a similar fashion, but is instead localized to a single application's startup.
  • Konqueror offers increased loading speed by prefetching domain name data in KHTML . A find-as-you-type bar improves navigation in webpages.
  • Revocation improvements include native support for the Online Certificate Status Protocol ( OCSP ) providing real-time certificate validity checking, CRL prefetching and CAPI2 Diagnostics.
  • Out-of-order execution and on-chip prefetching reduce the memory latency bottleneck at the expense of using more transistors and increasing the processor complexity.
  • In these games, data required for an upcoming level is loaded into memory in the background as the player approaches it, a process known as prefetching.
  • "' Prefetching "'in computer science is a technique for speeding up fetch operations by beginning a fetch operation whose result is expected to be needed soon.
  • Usually this is before it is " known " to be needed, so there is a risk of wasting time by prefetching data that will not be used.
  • Although these are a very small fraction of the total number of misses we might see without any prefetching, this is still a small non-zero finite number of misses.
  • New information presents improvements in multithreading, resilency improvements ( Intel Instruction Replay RAS ) and few new instructions ( thread priority, integer instruction, cache prefetching, data access hints ).
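
A few of the sentences above describe prefetching addresses that lie a fixed number of addresses ahead of the current access. The fragment below is a minimal sketch of that stride pattern in C, assuming a GCC- or Clang-compatible compiler for the __builtin_prefetch hint; the stride of 16 elements and the name sum_with_prefetch are illustrative choices, not taken from any of the quoted sources.

```c
#include <stddef.h>
#include <stdio.h>

/* Stride prefetching sketch: while summing an array, hint the cache to
 * fetch the element that lies PREFETCH_STRIDE positions ahead of the one
 * currently being read. The stride value is illustrative, not tuned. */
#define PREFETCH_STRIDE 16

static long sum_with_prefetch(const long *data, size_t n)
{
    long sum = 0;
    for (size_t i = 0; i < n; i++) {
        if (i + PREFETCH_STRIDE < n) {
            /* arguments: address, 0 = read access, 1 = low temporal locality */
            __builtin_prefetch(&data[i + PREFETCH_STRIDE], 0, 1);
        }
        sum += data[i];
    }
    return sum;
}

int main(void)
{
    long values[1024];
    for (size_t i = 0; i < 1024; i++)
        values[i] = (long)i;
    printf("sum = %ld\n", sum_with_prefetch(values, 1024));
    return 0;
}
```

Whether such a hint actually helps depends on the cache line size, the access pattern, and how far ahead the hardware prefetcher already runs; prefetching too far or too much, as one sentence above notes, just pulls in useless data.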
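
The definitional sentences above, and the one about games loading an upcoming level in the background, describe the same idea at the application level: start a fetch before its result is known to be needed and accept the risk that the work is wasted. Below is a minimal, hypothetical sketch of that pattern using a POSIX thread; the file name level2.dat, the buffer size, and the helper names are invented for illustration, and the program is built with -pthread.

```c
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical application-level prefetching: while the current level is
 * still being played on the main thread, a worker thread reads the next
 * level's data file into memory so it is (hopefully) resident when needed. */

#define LEVEL_BUF_SIZE (1u << 20)

struct prefetch_job {
    const char *path;    /* file holding the upcoming level's data  */
    char       *buffer;  /* destination buffer filled by the worker */
    size_t      loaded;  /* bytes actually read                     */
};

static void *prefetch_worker(void *arg)
{
    struct prefetch_job *job = arg;
    FILE *f = fopen(job->path, "rb");
    if (f) {
        job->loaded = fread(job->buffer, 1, LEVEL_BUF_SIZE, f);
        fclose(f);
    }
    return NULL;
}

int main(void)
{
    struct prefetch_job job = { "level2.dat", malloc(LEVEL_BUF_SIZE), 0 };
    pthread_t worker;

    if (job.buffer == NULL)
        return 1;

    /* Start loading the next level in the background ...              */
    pthread_create(&worker, NULL, prefetch_worker, &job);

    /* ... while the main thread keeps running the current level here. */

    /* When the player reaches the transition, wait for the data.      */
    pthread_join(worker, NULL);
    printf("prefetched %zu bytes of level data\n", job.loaded);
    free(job.buffer);
    return 0;
}
```

If the player never reaches the next level, the bytes read by the worker are simply discarded, which is exactly the wasted-work risk the sentences above mention.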