
x86, cmpxchg: Move 32-bit __cmpxchg_wrong_size to match 64 bit.

Signed-off-by: Jeremy Fitzhardinge <jeremy.fitzhardinge@citrix.com>
Link: http://lkml.kernel.org/r/4E5BCC40.3030501@goop.org
Signed-off-by: H. Peter Anvin <hpa@linux.intel.com>
Jeremy Fitzhardinge, 14 years ago
Parent
Current commit
416185bd5a
1 changed file with 1 addition and 2 deletions

arch/x86/include/asm/cmpxchg_32.h (+1, -2)

@@ -9,6 +9,7 @@
  */
 
 extern void __xchg_wrong_size(void);
+extern void __cmpxchg_wrong_size(void);
 
 /*
  * Note: no "lock" prefix even on SMP: xchg always implies lock anyway.
@@ -84,8 +85,6 @@ static inline void set_64bit(volatile u64 *ptr, u64 value)
 		     : "memory");
 }
 
-extern void __cmpxchg_wrong_size(void);
-
 /*
  * Atomic compare and exchange.  Compare OLD with MEM, if identical,
  * store NEW in MEM.  Return the initial value in MEM.  Success is